Shop Talk: 2020-02-15

The Recording

The Panelists

  • Kevin Feasel
  • Mala Mahadevan
  • Tracy Boggiano

Notes: Questions and Topics

Data Platform WIT Day

Mala and Kathi Kellenberger are hosting Data Platform WIT Day on March 5th, and Tracy will be one of the speakers. Registration is free via Meetup.

PASS Content via Redgate

Our first topic was around PASS content being available through Redgate. They’ve made PASS Pro content available on the Redgate Hub, and accessing that previously-gated content is now free.

In addition, they have taken control of the PASStv YouTube account. They’re uploading the PASS Summit 2019 Learning Pathway videos as well as PASS Virtual Summit 2020 videos.

We also spent some time on the relative merits of Redgate directly advertising on the PASStv YouTube account. Here’s my stance (after Solomon made good points in chat): if Redgate intends to release PASS content as a good deed, then yes, slap Redgate logos on things, make it available via Redgate’s website, and advertise away. They bought the rights to this content and PASS had the authority to sell them the rights, so no complaints there.

If Redgate intends to foster a new community organization, the better approach is to keep it as vendor-neutral as possible. I don’t think a SQL Server community really succeeds if it’s tied closely into one commercial enterprise—even if that enterprise were Microsoft, but especially not a third-party vendor. It plays havoc with the ability to be a 501(c)(3) (something PASS never was due to its early closeness to Microsoft) and probably drives away other vendors who supported PASS in the past but don’t want to give too much support to their competitor.

Ultimately, that’s Redgate’s choice to make, but I don’t think there’s a way to thread the needle.

Always Encrypted Issues

Our next topic was around Always Encrypted, sent in by @rednelo7625. Here’s a summary of the question:

I have a table with about 5000 rows called [Case] with a column named “Title”. It was encrypted with an encryption type of Randomized. After a year, I needed to change the encryption type to Deterministic because of a new query which includes “Title” in a GROUP BY clause. You can only do that if the type is Deterministic.


So far this week 142 cases have been added to the table.  And I’ve seen 1 user still get the same 500 error when inserting a new record through the website:

Operand type clash: nvarchar(10) encrypted with (encryption_type = ‘RANDOMIZED’, encryption_algorithm_name = ‘AEAD_AES_256_CBC_HMAC_SHA_256’, column_encryption_key_name = ‘CEK_Auto1’, column_encryption_key_database_name = ‘DBNAME’) is incompatible with nvarchar(max) encrypted with (encryption_type = ‘DETERMINISTIC’, encryption_algorithm_name = ‘AEAD_AES_256_CBC_HMAC_SHA_256’, column_encryption_key_name = ‘CEK_Auto1’, column_encryption_key_database_name = ‘DBNAME’).

Also,  now when I rebuild the index on the table I get the following DBCC error:

Msg 206, Level 16, State 2, Line 29

Operand type clash: nvarchar(max) encrypted with (encryption_type = ‘DETERMINISTIC’, encryption_algorithm_name = ‘AEAD_AES_256_CBC_HMAC_SHA_256’, column_encryption_key_name = ‘DBNAME’, column_encryption_key_database_name = ‘DBNAME’) is incompatible with varchar

So, it appears that somewhere the column is still associated as Randomized or not at all…

There are two error messages of note here. The first is that nvarchar(10) clashes with nvarchar(max). The second is that nvarchar(max) clashes with varchar.

For the first message, my best guess is that some stored procedure or .NET calling code has a parameter set to 10 bytes and is trying to update the column. When working with Always Encrypted, the data lengths must match up, so if you define the column as nvarchar(max), all calls from .NET must have a data length of -1 (equivalent to max).
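To make that concrete, here’s a hypothetical sketch of the mismatch (all object names invented; the real caller could just as easily be ADO.NET code instead of a procedure). The driver derives a parameter’s encryption metadata from its declared type, so a 10-character parameter doesn’t line up with an nvarchar(max) column:

```sql
-- Hypothetical table: Title is encrypted with Always Encrypted.
CREATE TABLE dbo.[Case]
(
    CaseId INT IDENTITY(1,1) PRIMARY KEY,
    Title NVARCHAR(MAX) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = CEK_Auto1,
                        ENCRYPTION_TYPE = DETERMINISTIC,
                        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256')
);
GO

-- Bug: NVARCHAR(10) here produces exactly this operand type clash.
-- Fix: declare the parameter NVARCHAR(MAX) to match the column.
CREATE PROCEDURE dbo.InsertCase
    @Title NVARCHAR(10)
AS
BEGIN
    INSERT INTO dbo.[Case] (Title) VALUES (@Title);
END;
```

If the calling code is .NET rather than a stored procedure, the equivalent fix is setting the SqlParameter’s Size to -1 so the parameter maps to nvarchar(max).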

For the second message, which shows up during index rebuilds, the cause is stranger. My recommendation there was to rebuild the table. Essentially, take a downtime and:

  • Change all of the encrypted columns back to plaintext
  • SELECT * INTO dbo.BackupTableName FROM dbo.CurrentTableName to move all data over to a new table.
  • Drop foreign key constraints or stored procedures which reference the table. Drop the table.
  • Re-create the table with plaintext columns.
  • Re-load the data from dbo.BackupTableName.
  • Encrypt using the combination of deterministic and randomized encryption as desired.
  • Re-create the stored procedures which reference the table. Also re-create any foreign key constraints as needed.
  • Make sure any .NET code is using correct data lengths.
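The steps above can be sketched in T-SQL, with every object name a placeholder (run this only during the downtime window; the decrypt and re-encrypt steps themselves happen through the Always Encrypted wizard in SSMS or the Set-SqlColumnEncryption PowerShell cmdlet rather than plain T-SQL):

```sql
-- Placeholder object names throughout.
-- (Columns have already been decrypted back to plaintext.)
SELECT * INTO dbo.BackupTableName FROM dbo.CurrentTableName;

-- Drop everything that references the table, then the table itself.
ALTER TABLE dbo.SomeChildTable DROP CONSTRAINT FK_SomeChildTable_CurrentTableName;
DROP PROCEDURE dbo.CurrentTableName_Update;
DROP TABLE dbo.CurrentTableName;

-- Re-create the table with plaintext columns (definition elided), then re-load.
SET IDENTITY_INSERT dbo.CurrentTableName ON;
INSERT INTO dbo.CurrentTableName (CaseId, Title /* , ... */)
SELECT CaseId, Title /* , ... */
FROM dbo.BackupTableName;
SET IDENTITY_INSERT dbo.CurrentTableName OFF;

-- Re-encrypt the columns (wizard or Set-SqlColumnEncryption), then
-- re-create the procedures and foreign keys dropped above.
```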

I’m not sure if switching from randomized to deterministic encryption in Always Encrypted is officially supported; there isn’t much information available online about it. Given rednelo’s table size (about 5,500 rows), I’d rather just re-create the thing than try to troubleshoot it and possibly see other errors pop up in the future.

Power BI Dashboard in a Day

If you are new to Power BI, the Dashboard in a Day is a good way of understanding some of what is possible with the product. Microsoft has an actively updated list of events, so check one out in your time zone. If you’re already familiar with Power BI, you won’t gain much from it, though—it’s really intended to be an introductory overview of the product.

Shop Talk: 2020-02-01

The Recording

The Panelists

  • Kevin Feasel
  • Mala Mahadevan
  • Mike Chrestensen

Notes: Questions and Topics

Hail Mike

Welcome to TriPASS’s newest board member, Mike Chrestensen. Mike is taking over for Tom Norman, who has moved down to Texas. Mike will serve out the remainder of Tom’s term and will be up for election at the end of 2021.

Tips for Exam Study

Denise e-mailed us with a question:

My question is about the 70-761 Microsoft Exam and Azure-900 Exam (I do have books). Do you know anybody in our group that passed those and would provide tips for further study & preparation for the exam? Recommendations for testing tools?

Mike and I both recommended Itzik Ben-Gan’s exam prep book for 761 (and its prior iteration, 461). Frankly, even if you never intend to take the exam, this is a book worth owning because Itzik does a great job of teaching and includes a surprising amount of depth in an exam study book.

As far as AZ-900 goes, this is an introductory exam. My recommendation for that one is to check out the exam skills outline and also try the Microsoft Learn items linked to the exam. You’re being tested on concepts, basic services, and higher-level generalities more than deep knowledge on any given service, and so I wouldn’t stress too hard on this one.

Redgate Acquires PASS Assets

We also covered Redgate’s acquisition of PASS assets. At this point, there isn’t too much information available, but we do know a couple of things:

  • Redgate does not get access to any user or attendee data that PASS previously owned. If you want to keep in the loop, you’ll have to sign up anew.
  • Redgate promises a PASS Summit-like conference, as well as the return of SQL Saturdays. As of right now, the SQL Saturday URL doesn’t direct to a server, but I imagine they’re working on building out a new site. PASS did not give them any code as part of the acquisition, so they’re building anew. Which, frankly, is a good idea when you consider what the SQL Saturday system ran on…
  • Redgate did obtain recordings from various PASS virtual groups and events, and they are working to bring that back online.

If you want to share your thoughts on what the future of the SQL Server community should look like, there’s a forum on SQL Server Central for this.

Mala and I shared our guarded optimism on this, though it will be interesting seeing if Redgate is able to get a consortium of vendors interested in sponsoring a Redgate-led conference.

Removing Features from SQL Server

Our key time sink for the evening was a Brent Ozar blog post which Mala and I both marked on our show agenda. The gist is that Brent tweeted out a question: if you could get rid of one feature in SQL Server, what would it be? For the show, we looked at each of the features that people submitted and gave our takes. Forthwith:

  • Cursors. Nope, keep them. Cursors can solve certain problems that you can’t solve with set-based code. Prior to SQL Server 2012, cursors were the best solution for running totals (assuming you didn’t want to bank on accidental behavior around clustered index sorting). I’ve also used cursors to solve problems which don’t have pleasant set-based solutions. Then there’s the value in using cursors for administrative work. Yeah, you can say to use Powershell or .NET or bash or some other language to do the work, but if I’m in T-SQL and don’t want to build out a bunch of extra code in other languages, this is a good route. Also, for all of the people who said cursors, nobody wanted to get rid of WHILE. This is despite WHILE typically being much slower than properly-tuned cursors. I get that there are people who misuse cursors, and I get that set-based solutions are generally much better. But that doesn’t mean it makes sense to drop the functionality.
  • MERGE. This is one where I’d rather they fix it than dump it. There are too many bugs around MERGE for me to recommend people actually use it, but it’d be so nice if it worked as expected.
  • Triggers. Again, no. Those are useful for good reasons. They can be misused, but there’s functionality that I can’t easily replicate without triggers. Or I can, but then other people want to get rid of them as well…
  • VARBINARY(MAX) or IMAGE data types. No to the first, yes to the second. The IMAGE data type is already deprecated; don’t use it anymore. As far as VARBINARY(MAX) goes, it’s quite useful. I can store compressed data, store ML models, and store things which I need access to in SQL Server without having to go elsewhere. Yeah, I get the pain of storing website images in a database—don’t do that. But that doesn’t mean getting rid of the data type is a good idea.
  • GEOGRAPHY and GEOMETRY data types. Those are quite useful for spatial data, and if I had a dollar for every time somebody put longitude in the latitude column or vice versa, I’d have several dollars. Also, what are you storing longitude and latitude as? Float? Decimal? VARCHAR? It’s also a lot more pleasant to perform geographical distance calculations with these data types than it is if you’re just storing floats.
  • UNION and DISTINCT. This was a person who’s been burned by slower-than-necessary code. Otherwise, these constructs are so useful that it’s a non-starter.
  • WITH(NOLOCK) and instead force the use of the term READ UNCOMMITTED. I can get behind this, though I shudder at the hundreds of millions of lines of code which would break as soon as that happened…including, I’m sure, Microsoft code.
  • Auto-shrink. Sign me up for this one. I don’t think there’s a single good use for it, and would be happy to see it disappear entirely from the product. If you need to shrink a database, that should be an uncommon occurrence. Uncommon enough that you shouldn’t need to automate it.
  • Auto-close. I do have a good use for auto-close: development instances with large numbers of infrequently-accessed databases. Yeah, that’s a pretty specific use case, but if you’re in a low-memory situation or have a huge number of databases which rarely get used (and don’t mind the spinup cost when you do use them), auto-close is a viable solution.
  • Lightweight pooling. My recollection is that this was useful a couple decades ago, but has been a bad idea since, oh, 2005 or so.
  • Priority boost. This was always a bad idea. I’m okay with canning it.
  • OLE Automation. I haven’t worked with this, to be honest, so I don’t have a strong opinion.
  • CLR. A lot of DBAs dislike CLR, but I think it’s reflexive, like a vampire seeing sunlight. Or a developer seeing sunlight. If you want to argue that CLR has a fairly niche role, okay. As long as you ignore all of the stuff in T-SQL which is really CLR (PARSE, FORMAT, STRING_SPLIT, STRING_AGG, etc.). But I’ve written and used CLR modules to great effect, as it lets me do things in-database, simplifying solutions.
  • Service Broker. My beef with Service Broker is that it never got the UI it deserved. It’s a solid queueing system which works quite well, but it takes a while to understand.
  • Linked servers. The person mentioned that any work which requires linked servers should be done “in the mid-tier.” Sometimes I don’t have a mid-tier. Sometimes I don’t want a mid-tier. Occasionally, I just want to read the contents of a text file, and linked servers are an option (or OPENROWSET or PolyBase, both of which I’m also counting in this). Linked servers are useful for ETL, reference data lookups, migrating data from other data platforms, and plenty more. Yes, they can be slow and I don’t recommend using them frequently in OLTP scenarios. But SQL Server handles more than OLTP.
  • Database Tuning Advisor. This is an interesting one to think about. The kind of shop which uses DTA typically doesn’t know better and often ends up in a hole. But without DTA, they’d be in a different and possibly more painful hole. I don’t like DTA at all and hate having to clean up after it so much, but when you don’t have a DBA and “the database is slow,” I see the allure.
  • Full-text indexing. Meh. I’ve seen it in action a couple of times, seen it fail to work a couple of times (usually because data sets are too large), and have never been too thrilled with it. But it’s definitely better than LIKE '%Something%' searches all over your code.
  • XML indexing. Again, meh. I don’t think I’ve ever seen it in production, but it does work. I guess.
  • FILESTREAM / FileTable. I like FileTable. I think it does a good job of what it’s supposed to do: showing metadata for files in a share. You don’t get the usual complaints about storing files in the database, but you also don’t need some custom-written app to keep a table in a database up to date with a file share. It’s a niche set of functionality, but again, I’ve benefitted from having it.
  • SSAS / SSRS / SSIS, and change the pricing model. Strongly against. SQL Server has always had an AND pricing model: you get this AND this AND this AND this, all for the same price. By contrast, Oracle has an OR pricing model. If you want feature A, you pay $$. For feature B, you pay an additional $$. For feature C, you pay an additional $$$. Pretty soon, $$$$$$$ adds up to a bunch of dollars. The biggest problem is that splitting out the products wouldn’t really decrease the price; it’d likely kill the other products as cross-subsidies would no longer apply. Today, I pay for SQL Server and get SSAS for free, so I can try it out. Maybe it turns out to be great and I use it so much that I add it to another server (and pay that additional license fee). Or maybe I don’t like it. If business decision-makers had to buy SSAS separately, there’d be a lot less of it and those of us who do use it would be worse off as a result.
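As a quick illustration of the spatial point above (coordinates here are rough values for Raleigh and Durham): the GEOGRAPHY type validates its inputs and ships with distance functions, where a pair of float columns leaves you writing haversine math by hand.

```sql
-- Note the argument order: geography::Point takes latitude FIRST,
-- then longitude; exactly the mix-up these types help you catch.
DECLARE @Raleigh GEOGRAPHY = geography::Point(35.78, -78.64, 4326);
DECLARE @Durham  GEOGRAPHY = geography::Point(35.99, -78.90, 4326);

-- STDistance returns the distance in meters for SRID 4326.
SELECT @Raleigh.STDistance(@Durham) AS DistanceInMeters;
```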

Summing it up, I agree with some of these, particularly auto-shrink, priority boost, the IMAGE data type, and lightweight pooling. I can be swayed on a couple more. But most of these are useful tools whose detractors have seen them misused. It’s not the trigger’s fault that some developer created cross-server, multi-step triggers which call multi-statement UDFs, and that we update it by hitting the ASMX web service hosted on our SQL Server.

Shop Talk: 2021-01-18

The Recording

The Panelists

  • Kevin Feasel
  • Mala Mahadevan

Notes: Questions and Topics

Resource Groups in Azure

The first topic of discussion was around the proper scope of resource groups. Here’s my thought on it:

My general philosophy is that a single resource group contains the whole set of things you’d destroy when it’s time to decommission something. From that perspective, it’s easy to think of one application per resource group: as it’s time to tear down an application, you delete the resource group and you’re done. Even with a hundred applications, I don’t think I’d be too upset about one resource group per set of application services if they’re independent applications.

That said, there are services which go across several applications. For example, a data warehouse or data lake might be used in several applications as a data source. Those core services could also get their own resource groups, but that comes with the downside risk that if all of the dependent users of the service get destroyed, the service might still be around, at which point it would be wasting money. So there’d have to be some sort of architectural mapping across application/resource group boundaries to keep track of dependencies. The list of resource groups itself would no longer be sufficient to provide all of this information.

Another thought would be: how independent are these applications? If they’re effectively independent microservices which communicate via expected protocols (like HTTP or gRPC) rather than shared infrastructure, then one resource group per application can sound really viable. But if some of the applications are tightly coupled together, it may make more sense to put them all into the same resource group, especially if there is considerable overlap. Suppose, for example, that five applications all use the same SQL Server VM to store data, that there are cross-database queries galore, and that each app is dependent on the others for survival: if you got rid of the other four, any individual one would not be particularly useful. In that case, it could make more sense to put that cluster of applications into a single resource group.

A related thought is: how much resource overlap is there? Do all of the apps use the same Azure App Services? Are you sharing one Azure Container Registry for everything? Is there one single Azure Synapse Analytics workspace across all of the applications? Or maybe one Azure Function app? The more interconnected services exist, the less well “one resource group per application” works in practice. That said, there will probably be some overlap automatically: they may all use the same Azure DevOps implementation and store all of their code in the same GitHub account (though in different repositories). The fact that there is some overlap isn’t necessarily contradictory; it’s the level of overlap which counts.

Solarwinds and Security

@lisanke chimed in with a really good question:

does anyone have or work for an organization that bought Solarwinds and caught a sunburst? Or does anyone buy a single source security mechanism for their Org? Do you now worry that u’ve put all your security eggs into one basket? np if this is too big for an add-on discussion tonight… maybe something to ask for a future talk. Or more general, supply-chain bugs.

This led to a lengthy discussion around risk.

Availability Group Failover Causes

@vijayb5 has a good question as well:

what approach you would think will guide us to determine root cause to find failover of alwayson availability group i have used Failover Detection Utility most of the time it never give proper reason other than showing the possible error around that time. i am interested to find what caused the error.

This is one that I tried to answer, but we’re bouncing it off of Tracy for a better answer in the next episode.

Upgrades and Server Names

Chris Wood hits us with a question, too:

My upgrade problem. I have a SQL2019 CU4 server running on Windows 2019 that had the server renamed. The appropriate drop servername and add servername was run but the registry wasn’t changed to indicate the current servername

When I ran the CU8 upgrade it throw a 574 error running msdb110_upgrade.sql against master and master wasn’t recoverable. We had disk snapshots and recovered to another server. I had heard of using Trace Flag 902 to stop the master being upgraded. I would like to try again sometime but I am now unsure what to do. This is the first major failure in about 20 years of SQL upgrades

In this case, it sounds like there’s an issue with running an upgrade script for SQL Agent. I was working from the assumption that it’s related to the server name change, though Solomon cautions that this isn’t necessarily the case. In any event, Mala’s recommendation on this is to contact Microsoft support. Mine is to try to convince the customer to install on a new server rather than upgrading in-place.

PolyBase and Pushdown

I ended the episode with an issue that a person e-mailed me about. It shows just how difficult it can be to get PolyBase to perform well when running SQL Server to SQL Server. The short version is that a particular query can work fine when returning a result set, but as soon as we try to save the result as a variable, performance tanks due to a predicate not being pushed down to the remote server. It’s a weird one and I’ll have a proper blog post about it someday.
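To sketch the shape of the problem (table and column names here are invented, and the real query was more involved than this):

```sql
-- With an external (PolyBase) table pointing at another SQL Server,
-- this version pushed the predicate down to the remote instance:
SELECT COUNT(*)
FROM dbo.RemoteOrders
WHERE OrderDate >= '2020-01-01';

-- But assigning the result to a variable changed the plan so that the
-- predicate was no longer pushed down: the remote table streamed over
-- in full and SQL Server filtered it locally.
DECLARE @OrderCount BIGINT;

SELECT @OrderCount = COUNT(*)
FROM dbo.RemoteOrders
WHERE OrderDate >= '2020-01-01';
```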

Shop Talk: 2021-01-04

The Recording

The Panelists

  • Kevin Feasel
  • Mala Mahadevan
  • Tom Norman

Notes: Questions and Topics

Happy Trails, Tom

Our major announcement is that Tom Norman is moving to Texas. I appreciate everything that Tom has done for us at TriPASS and we all wish him well.

As a result, we do have an opening for a board member. If you are a TriPASS member in good standing and interested in being the VP of Membership, please reach out to me.

TriPASS Survey Results

Our first topic of the night was the results of the TriPASS survey. We had enough respondents to get some interesting results. Here’s the quick version:

  • A small majority of TriPASS members who voted said that they would potentially be interested in paid training sponsored by TriPASS, with a price point between $100 and $150, so basically the price of a SQL Saturday pre-con. There were quite a few topics mentioned, but little consistency in topic selection so we’ll have to talk about it as a board to see if there is a good training option we might put on.
  • Thanks to everybody who contributed topics of interest for us to cover. The good news is that we have plans to cover at least 70% or so of the requested topics in the next year. And I might find a way to bring in one or two more.
  • We asked about special interest groups, and we will continue to have two special interest groups in addition to the main meeting. The second Tuesday of the month will continue to be Advanced DBA. For the fourth Tuesday of the month, the title will be Data Science and Business Intelligence. This means that we will include more topics like Power BI, Tableau, and data modeling, but will keep data science and machine learning topics in play as well.
  • Finally, the last question on the survey that I covered pertained to Shop Talk cadence. We have decided to move Shop Talk to run once every two weeks. This means that we will meet every other Monday at 7 PM Eastern.

Thank you to everybody who took the time to fill out the survey.

Power BI End-to-End

Melissa Coates has an updated document for us: the Power BI End-to-End Diagram. Melissa has kept this up to date for several years, and it’s crazy to see just how much Power BI has changed over time. We joke about having a release every other day, but it feels true…

Data Professional Salary Survey, 2021 Edition

Brent Ozar has released the results of the 2021 Data Professional Salary Survey. I haven’t taken the time to look at the data yet, but I have for prior years. It’s a skewed data set—primarily people who know of Brent (or are in that extended network), so it’s going to focus mostly on SQL Server. But if you understand that this isn’t the entire data platform professional population, it’s a very interesting data set for analysis.

Shop Talk: 2020-12-28

The Recording

The Panelists

  • Kevin Feasel
  • Mala Mahadevan
  • Tracy Boggiano

Notes: Questions and Topics


Our major announcement is that the TriPASS 2020 membership survey is open. If you haven’t filled it out, it will be open for the next couple of weeks.

Data Saturdays

We spent the first segment of the show talking about the post-PASS future. We included a link to Data Saturdays, which is one step toward replacing SQL Saturdays.

I’d also recommend checking out Call for Data Speakers, a service hosted by Daniel Hutmacher to give data platform speakers and conferences a central location, even outside of the SQL Saturday/Data Saturdays paradigm.

Azure Databricks and Azure Synapse Analytics

We spent a good amount of time walking through the differing use cases of Azure Databricks and Azure Synapse Analytics. Microsoft has an architecture guide walking through the use cases. One point of difference I have: I don’t think HDInsight is worth using.

Ivana Pejeva has a great article on the topic. One thing about the article, though, is that it was written a few months back, so Azure Synapse Analytics has changed a bit since.

We also talked about Snowflake competitors; in the Azure cloud, the closest analogue is Azure Synapse Analytics dedicated SQL pools.

Time Series Databases

@rnicrosoft asked a question around analytics when you have 800 million or so key-value pairs. The transactional side solution is typically something like Cosmos DB, where you’re reading and writing single records at a time. But what happens when you need to perform analysis on the data?

One solution is to use something like Azure Synapse Link to pull that data from Cosmos DB into Azure Synapse Analytics and organize the data in a classic fact-dimension model.

But another solution would be to store the data in a time series database like InfluxDB and visualize it in Grafana. Tracy and I have implemented monitoring with InfluxDB, Telegraf, and Grafana, and you can use this for “normal” analytics as well.

Shop Talk: 2020-12-14

The Recording

The Panelists

  • Kevin Feasel
  • Mala Mahadevan
  • Tracy Boggiano

Notes: Questions and Topics


Our major announcement is that the TriPASS 2020 membership survey is open. If you haven’t filled it out, it will be open for the next couple of weeks.

Stuff’s Broken: Google

We started out by covering the Google outage that morning. Interestingly, several sources I’ve seen say that the outage was “about an hour,” but I know YouTube was down at 4:30 AM Eastern and services weren’t fully back on until approximately 7 AM.

We also talked about redundancy in communication technologies: if you’re using GSuite, have an alternative. If you’re heavily invested in O365, have an alternative. When we’re all remote, having those backup methods of communication may feel redundant but it can be critical. Anders brought up cases where critical people had satellite phones. That answer might be a bit extreme for many companies, but before laughing it off, ask yourself how much it costs the company if nobody can communicate for a few hours. Also, I mentioned the possibility of dual-use systems: if you have meeting software (Zoom, GoToMeeting, Teams, etc.) in addition to a separate provider for chat and e-mail, you can use it as a backstop…unless it’s also down.

Stuff’s Broken: the Rest of the World

Our major topic was all about Solarwinds Orion. This is a pretty big deal, as attackers were able to leverage a vulnerability in a downstream vendor to attack the real targets, which include quite a few US federal government agencies. CISA (which does not have authority over the Department of Defense) mandated that all non-defense agencies remove from the network any server running Solarwinds Orion software, and not bring those machines back onto the network until they wipe them and get the all clear.

The Departments of Commerce and Treasury have already announced breaches and Brian Krebs has a bit more, including speculation that it’s hit more than just those two departments.

Lesley Carhart lays out more of the story and points out both that this is not a new style of attack and that this is an extremely difficult vector to protect.

Pluralsight Acquisition

Our last major topic is that Vista Equity Partners is aiming to acquire Pluralsight for $3.5 billion. After an IPO in 2018, Vista would take Pluralsight back private. This still has to go through regulators, but I’m not sure I’d see any major roadblocks there.

We then spent a fair amount of time talking about various training offerings and how the market has shifted. When Pluralsight first came out, it was not the first of its kind, but it was a great way for developers to acquire professional training via video. They expanded this out over the years, but that market is pretty well filled out. Pluralsight has responded by moving more to certification training and project work rather than directed video training, and we’ve seen other training players move in that direction as well.

Shop Talk: 2020-12-07

The Recording

The Panelists

  • Kevin Feasel
  • Mala Mahadevan
  • Tracy Boggiano

Notes: Questions and Topics


Our major announcement is that the TriPASS 2020 membership survey is open. If you haven’t filled it out, it will be open for the next couple of weeks.

ETL Best Practices

Our big topic was around ETL best practices. We spent quite a bit of time on tooling, processes, ETL vs ELT, and scaling (or lack thereof) in major tools.

Books, Books Everywhere

I showed off my brand-new copy of Itzik Ben-Gan’s T-SQL Window Functions, 2nd Edition. I read the first edition several years back and I’m looking forward to seeing how Itzik has made this even better.

Mala, meanwhile, showed off her copy of Don Jones’s Shell of an Idea, which covers the history of Powershell.

Shop Talk: 2020-11-30

The Recording

The Panelists

  • Kevin Feasel
  • Tom Norman

Notes: Questions and Topics


We had one announcement: TriPASS board elections are open until December 3rd. All active TriPASS Meetup members are eligible to vote.

AWS Kinesis Outage

Tom and I talked a bit about another cloud outage: this time, AWS’s Kinesis service went down during Cyber 5. It had some big effects on downstream customers.

Building a Safe Sequence

The majority of this episode is dedicated to building a safe sequence for when you absolutely need contiguous numbers for ID values. Identity and sequence can, by design, leave gaps. 99% of the time (or more), that’s not an issue. But in the rare case where numeric IDs are part of an audit process, I show a technique to eliminate those gaps, though at the cost of slowing down insert performance considerably.

The notebook is part of my upcoming post-con for Data Platform Summit, so I’m not releasing it right now. But I might release a version of it in the future.
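The notebook aside, the core pattern is a common one, so here is a minimal sketch of one way to get there (the table name is a placeholder, and this isn’t necessarily exactly what’s in the notebook): serialize access to the current maximum so that no session can skip or duplicate a number.

```sql
CREATE TABLE dbo.AuditedDocument
(
    DocumentNumber INT NOT NULL PRIMARY KEY, -- must stay contiguous
    Payload NVARCHAR(100) NOT NULL
);
GO

BEGIN TRANSACTION;
    DECLARE @NextNumber INT;

    -- SERIALIZABLE + UPDLOCK holds a range lock on the max value, so
    -- concurrent inserts single-file through this block. A rollback
    -- undoes the insert before the number is ever claimed, which is
    -- why no gap can appear; the cost is insert throughput.
    SELECT @NextNumber = COALESCE(MAX(DocumentNumber), 0) + 1
    FROM dbo.AuditedDocument WITH (SERIALIZABLE, UPDLOCK);

    INSERT INTO dbo.AuditedDocument (DocumentNumber, Payload)
    VALUES (@NextNumber, N'Some document');
COMMIT TRANSACTION;
```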

Shop Talk: 2020-11-16

The Recording

The Panelists

  • Kevin Feasel
  • Tom Norman

Notes: Questions and Topics


We had one announcement: TriPASS board elections are coming up this Thursday. They will run for the next three weeks. Because we had one nominee for each position, we’ll have an up-or-down vote on the entire slate.

Upcoming Events

Data Platform Summit is also coming up, from November 30th through December 8th. Registration is open and you can use the discount code TRIANGLESQL for a nice discount.

PASS Summit Thoughts

Tom and I talked about PASS Summit for the entire episode. We had some great conversations in chat as well.

Mala wasn’t able to make it, but she shared her experiences in a blog post that we walked through. I also made mention of Andy Warren’s series on PASS Summit. Andy’s a guy I have a lot of respect for in the community and I’m glad he put in so much time to reflect on Summit from start to end.

After we finished recording, I saw Deb Melkin’s blog post summarizing PASS Virtual Summit, as well as Andy Levy’s. Check those out as well for additional thoughts on Summit.

TriPASS Plans in November

We will not have Shop Talk next Monday, November 23rd, as it is the Monday of Thanksgiving. We also are done with regular data science meetings for the year, so no regular meeting that week.

We will have Shop Talk on November 30th.

Shop Talk: 2020-11-02

The Recording

The Panelists

  • Kevin Feasel
  • Mala Mahadevan
  • Tom Norman

Notes: Questions and Topics


We had one announcement: TriPASS board elections are coming up in mid-November. We have three positions up for election this year: President, VP of Marketing, and Treasurer. All TriPASS members in good standing are eligible to run, and the election will run from November 19th through December 3rd, for terms running from February 23, 2021 through February 26, 2023. If you’re interested, e-mail us (shoptalk at tripass dot org will work) and I can provide more details as needed.


Upcoming Events

PASS Summit is coming up, from November 10th through the 13th. Registration is still open and you can use the discount code LGDISCWY7 to save some cash. Tom also gave the full-court press on why you should sign up for PASS Virtual Summit this year.

Data Platform Summit is also coming up, from November 30th through December 8th. Registration is open and you can use the discount code TRIANGLESQL for a nice discount.

SSMS and ADS Joined Together

Our big topic for the evening was about SQL Server Management Studio 18.7, which now forces installation of Azure Data Studio.

Tom, Mala, and I all share fairly similar opinions, so I’ll lay out mine just a little bit. I don’t hate the idea, but I do see two problems with it. First, this is a required installation. Microsoft says you can uninstall ADS afterward if you choose, but I’d much rather there be an option to install or not rather than having to uninstall later. I use Azure Data Studio pretty regularly and I still don’t like the notion of forcing an installation when all you wanted was SQL Server Management Studio.

Second, I’m a bit curious about release cadence and bloat. SSMS releases less frequently than ADS, so if I subsequently install 18.7.2, will it try to install an old version of Azure Data Studio over my current version? I don’t know the answer to that and figure we’ll have to wait and see. But we do see from Glenn Berry that ADS is installed as a system installation rather than a user installation, so you might have two copies as a result. You might have a third copy if you’ve used the Insiders track. There’s nothing SSMS-specific about this, as you can install multiple copies all on your own, but I’m sure it’ll ruffle a few feathers.

Mala did bring up a good point that I’d like to mention here as well: SQL Server Management Studio is already a collection of installed applications—it brings along with it Profiler, Replication Monitor, Database Tuning Advisor (ugh), the Analysis Services deployment tool, and more. These are all separate applications that come along for the ride.

But that’s not to say I’m dismissive of the people who don’t want to install ADS, or of the people who want to make ADS an optional installation rather than a mandatory one. I understand concerns about corporate compliance, potential security risks (you are, after all, installing Python and a very extensible IDE), and so on. And given that you don’t need ADS to use SSMS, I’d rather they not require installation. Make it an option and I think a lot of the concern within the community simmers down.

This topic also led us down a rabbit hole of installing SSMS on servers. In real life, I don’t press too hard on the idea because it’s nice to have the tools there in case you need direct server access. But I wanted to push the other side of the argument as hard as I could, and I think SSMS 18.7 makes that “Don’t install SSMS on servers ever” argument a bit easier because it increases the scope of what you need to worry about on a server. Or we just keep copies of 18.6 around for a few years…

TriPASS Plans in November

We will not have Shop Talk next week (November 9th), as this is the start of pre-cons for PASS Summit. We also will not have an Advanced DBA group meeting that week, so expect no TriPASS-specific content unless I decide to do something ad hoc.

We will have Shop Talk on the 16th and Tracy will talk about dbatools & dbachecks; register for that now to get a reminder or just show up when we go live on Twitch.

We will probably not have Shop Talk on November 23rd, as it is the Monday of Thanksgiving. We also are done with regular data science meetings for the year, so no regular meeting that week.

Finally, we will have Shop Talk on November 30th.