Boolean order of precedence in T-SQL

I’ve been seeing a lot of people get confused by their WHERE clauses lately. Someone wanted to return rows where (X = A) and (Y = B or Z = C), but they got a totally different result. Sure, it’s physically written and read as (X = A and Y = B or Z = C), but logically that holds a different meaning.

This demo will help illustrate this for you.

Let’s create our table variable, put a little data inside it, and see what it looks like afterwards.

DECLARE @TestTable TABLE 
(
id int primary key
,color varchar(10)
,number int
,animal varchar(25)
);

INSERT INTO @TestTable 
    (id,color, number, animal)
VALUES  
    (1,'blue', 1, 'monkey'),
    (2,'red', 1, 'monkey'),
    (3,'pink',1,'rhino'),
    (4,'purple', 2, 'monkey'),
    (5,'orange', 2, 'chicken'),
    (6,'black',2,'cat');

select * from @testtable;
(Result set: all six rows from @TestTable.)

Neat! We got data! Well, let’s try the above scenario.

This was my example earlier: (X = A and Y = B or Z = C)

Translated to where clause language: (number = 1 and color = 'blue' or animal = 'monkey')

This will translate completely to:

SELECT *
FROM @testtable
WHERE number = 1 AND
color = 'blue' OR animal = 'monkey'

Now here’s the result set once we run the query:

(Result set: records 1, 2, and 4. This is the incorrect boolean return.)

But hold up! We are getting three rows back when we only expected records 1 and 2. How did record 4 get in there? The answer is operator precedence: AND is evaluated before OR. To make the query behave logically the way you wrote it out physically, you have two options.
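To see why record 4 qualifies, here is the same predicate with the implicit grouping spelled out; this is an equivalent form of the query above, shown only to illustrate how the engine evaluates it:

SELECT *
FROM @testtable
WHERE (number = 1 AND color = 'blue')
      OR animal = 'monkey';

Record 4 is a purple monkey, so it slips in through the OR branch even though its number is 2.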

The option many people recommend, because it helps protect against user error and accidents, is to use parentheses. They make the logical intent behind the code explicit.

SELECT *
FROM @testtable
WHERE (number = 1)
      AND (color = 'blue' OR animal = 'monkey');

You can also rewrite the query to lean on precedence (AND binds tighter than OR) and forgo the parentheses.

SELECT *
FROM @testtable
WHERE color = 'blue' OR animal = 'monkey'
AND number=1
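One caveat worth calling out: the rewrite above only matches the intended logic for this particular data set. Because AND binds tighter than OR, it is read as color = 'blue' OR (animal = 'monkey' AND number = 1), so a blue row with a different number would still slip in. A parenthesis-free form that stays equivalent for any data distributes the AND across the OR:

SELECT *
FROM @testtable
WHERE number = 1 AND color = 'blue'
   OR number = 1 AND animal = 'monkey';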

If you run the query with the parentheses fix or the precedence-based rewrite, your results will be as seen below:

(Result set: records 1 and 2. This is the correct boolean return.)

Regardless of your style or method, your team should have a standard for how SQL is styled and formatted. When your code becomes convoluted with unnecessary additions, it may be time to step back and evaluate how it is written and how it can be tuned. You want it to be maintainable, short, and in line with best practices and ANSI standards. The code should be documented, whether implicitly or explicitly, but preferably both.

I personally tend to use parentheses even when they are unnecessary (like the example above) because I want the code to be implicitly documented. I want the next developer who sees it to know exactly how I expect the logic to be evaluated. It also reduces the chance of making a precedence mistake while writing the code.

So when in doubt, check your order of precedence, and use your parentheses with proper discretion and vigor!

Here is the handy reference for the order of precedence in T-SQL.

https://docs.microsoft.com/en-us/sql/t-sql/language-elements/operator-precedence-transact-sql

Review of Itzik Ben-Gan’s Books

I’ll start by saying this isn’t a paid review, nor do I have any association with SolidQ or the publisher.

T-SQL Fundamentals (third edition) was highly recommended as a read for the 70-761 exam along with his other two books, T-SQL Querying and Training Kit (Exam 70-461): Querying Microsoft SQL Server 2012.

Looking at the three books, you can gather that the Fundamentals book covers the base skill set and the exam books are, obviously, structured around their exams. So are all three books worth a read? Which ones should you read, and in what order?

The 70-461 book is unbelievably deep and well written. While it is ultimately meant to help you pass an exam, you can take a lot of practical application from it. That said, if your goal is not to pass the 70-461 exam, T-SQL Fundamentals and T-SQL Querying are a fine read on their own and you could pass up the exam book. If you have the exam book available to read for free, however, I would definitely recommend at least cherry-picking the chapters you feel need additional attention and study time!

T-SQL Fundamentals is definitely the book I’d read first. It is also the book I wish someone had told me to go read when I first began working with databases. It covers the basic key concepts for SQL Server and then gives practical examples and applications. The code is easy to follow and expertly written; no word is wasted and it’s extremely well edited. If you manage junior to intermediate data professionals who primarily use SQL Server, you should make time for them to read this book. (Just my humble opinion.) This is what I’d consider the single best compilation of the need-to-know basics that any T-SQL developer will use. If you are on the senior end, it’s a fun read and you may rediscover some lost tidbits or patch a few holes in your knowledge; I still think it’s worth the read, along with the exam books. But it is definitely geared to junior to intermediate folks, with smatterings of expert advice.

Compared to T-SQL Fundamentals, T-SQL Querying does not hold back; it’s a tome of information. This is the book I’d want my team of senior SQL Server data professionals to read, though it may be too advanced for junior to intermediate readers. I would recommend reading through samples or paging through it in the store prior to purchasing. It’s a big book, extremely in depth and amazingly detailed. If you are in a senior SQL role and primarily use SQL Server, this is a great book for you. I’d recommend Fundamentals first to make sure the base layer of knowledge is in place, but it is skippable if you have enough experience. The exam book would be a great secondary read; otherwise I’d read T-SQL Querying last.

Passed the 70-761 exam, Querying Data with Transact-SQL!

Hurray, I passed 70-761! That means you can too!

I just thought I’d share briefly how I passed the exam. As a note, the below is how and what I studied; it may or may not reflect what is on the exam, since Microsoft changes the exams and I cannot discuss the exam content directly. Nor can I answer any specifics about the exam; you can find all of that information on Microsoft’s website. What I will share is how I got to the point where I felt comfortable taking, and then passing, the exam.

I thought the 70-761 was hard but fair; it was both easier and more difficult than I expected. Three things helped me succeed, and I’ll apply them again for the 70-762 exam: figuring out when to schedule the exam, how to study for it, and how to think through and answer the questions during the exam itself.

To schedule the exam, I broke out every topic covered in the exam description and sorted them into things I needed to learn, review, or skip. I then broke down how much reading each topic required and how long each reading session would take, gave myself an estimated time period for each topic, and kept a notebook tracker with the topics, days until the test, and how much reading was left. I allocated myself 30 minutes of study per day, divided the total time I expected to spend studying by that, and scheduled the test that many days out (for example, roughly 25 hours of estimated studying at 30 minutes a day works out to 50 days). Some days I could study for 30 minutes, some days not at all, and on weekends I could double down and put in a few hours at once. I also recorded what I studied on which day, with which resource, and roughly how many hours I put in.

Scheduling the exam lit a fire under my rump and got me moving. If I didn’t spend the time studying, I’d walk out humiliated, $160 poorer, and regretting what I could have accomplished. So by finding the amount of time I needed and allocating it, I scheduled the exam 50 days out, for 2018-12-08. Once the test was actually scheduled, I could no longer procrastinate; it was now a priority.

On the subject of studying, make sure to study smart. Don’t spend two hours watching videos on joins when you already have joins down. Spend the time on things you are unsure of or don’t understand. For topics you are 100% solid on, a brief review to check for gaps isn’t a bad idea, but I wouldn’t dedicate more time than reading the relevant chapter in the 70-761 study guide.

For study material, I read the entire 70-461 exam prep book and then specific sections of the 70-761 exam prep book, especially XML, JSON, and temporal tables, which other forums said to focus on heavily. One note I’ve read is that Microsoft likes to really drill you on features that are new to the version of SQL Server you are testing on, so I focused heavily on what was new from 2014 to 2016. I also read most of, and skimmed some chapters of, T-SQL Fundamentals.

I used additional resources from Cathrine’s website and MSSQLTips. I reviewed the XML Stairway series on SQLServerCentral and also watched every video recommended in the links I’m providing below.

JSON and temporal tables were hard to find good resources for. I ended up reading the entirety of the MSDN documentation on temporal tables. I found XML and JSON easiest to learn by practicing problems I’d think up in AdventureWorks; just reading the material did not stick as well. Anything I hadn’t used regularly, I made sure to read up on, watch videos about, and practice. I would also search Stack Overflow for problems related to topics I wasn’t clear on and try to solve them, or at least read through them. If you aren’t feeling confident in an area, look for another online resource. Sites that can test your knowledge are a fantastic way to realize which areas you are lacking in, and being able to explain something to someone else is also a good indicator of how well you understand it.

I would also listen to podcasts and videos while running each morning, but I honestly don’t think that helped much for the test. For the topics I found hardest, I’d start by reading the prep chapter in the exam book, then look for reputable videos on the topic, read an additional online blog or two, and then create practice problems. If I still felt unsure, I’d find a podcast on the topic to listen to loosely during free time like driving or running, and look for flash cards and quizzes online. The most beneficial study tool was using the technology day in and day out; beyond that, the 70-461 and 70-761 exam books were my bread and butter. Quizzes and flash cards helped me a lot in identifying the areas I needed to revisit.

Everyone else has said this about the test but I’ll say it too.

Use your test taking abilities.

If a question is taking you a long time, mark it for review, continue with the test, and come back to it at the end. I used every second of the exam’s 2 1/2 hours. I don’t know if more time would have changed my score, but 30 more minutes would have made me feel more comfortable and less rushed.

Even if you are stumped by a question, at least narrow down the answers you have to pick from. From the narrowed-down options you can make an educated guess.

Don’t leave any answer blank and try to answer them all.

If I were to do it again, I’d do more flash cards on definitions of terms, and I’ll try that for 70-762. If I were really uncertain, I’d probably take the official practice test. I recommend the 70-461 book simply because there is so much good information inside it; I only bought the 70-761 book because it covers JSON and temporal tables. You could go with the 70-761 book plus T-SQL Fundamentals instead of the 70-461, or go with the 70-461 plus the temporal tables section in T-SQL Fundamentals and add some supplemental JSON training, since that is lacking in the 461 exam book. My test will probably not be the same as yours, so look over the content you can be tested on via the Microsoft site for the cert. Good luck, and I’ll see you on the other side of 70-762 shortly!

My favorite resources:

https://www.cathrinewilhelmsen.net/2015/01/28/preparing-for-and-taking-exam-70-461-querying-microsoft-sql-server-2012/

https://www.mssqltips.com/sqlservertip/4644/sql-server-exam-70761-study-material-for-querying-data-with-transactsql/

https://www.amazon.com/T-SQL-Fundamentals-3rd-Itzik-Ben-Gan/dp/150930200X

https://www.amazon.com/Training-70-461-Querying-Microsoft-Server/dp/0735666059/ref=sr_1_1?ie=UTF8&qid=1513628070&sr=8-1&keywords=70%3D461

https://www.amazon.com/Exam-70-761-Querying-Data-Transact-SQL/dp/1509304339/ref=sr_1_1?s=books&ie=UTF8&qid=1513628083&sr=1-1&keywords=70-761

http://www.accelerated-ideas.com/70761-practice-test-questions.aspx#.WeeTWVtSypo

https://www.mssqltips.com/sqlservertip/4015/introducing-json-for-sql-server-2016/

https://www.mssqltips.com/sqlservertip/4073/sql-server-2016-advanced-json-techniques--part-1/

https://docs.microsoft.com/en-us/sql/relational-databases/json/json-data-sql-server

https://www.mssqltips.com/sqlservertip/3680/introduction-to-sql-server-2016-temporal-tables/

https://docs.microsoft.com/en-us/sql/t-sql/queries/select-group-by-transact-sql

https://www.mssqltips.com/sqlservertip/1958/sql-server-cross-apply-and-outer-apply/

Home VPNs: are you updating your IP securely?

Normally I blog about SQL Server, but this post (and one more coming) answers questions I could not find answered on the internet, so I had to share!

As many folks who start dabbling in home VPN setups realize, or already knew (I’m a database guy, not a network admin!), you need a DDNS provider. Your home IP will probably change, so connecting to it directly may work for a little while, but it can change unexpectedly. The solution is to have something update your IP with a provider every few minutes, so the provider can give you a static name that automatically maps to your current IP and keeps things working consistently.

There are many benefits to running a home VPN and using a DDNS service, but the one I was after is blocking all ads on my phone: not just hiding them, but preventing them from even being downloaded. I run a Pi-hole at home that acts as my DNS server and blocks ads at the DNS level, so keeping my phone on my home network at all times saves data AND blocks ads everywhere. If that weren’t great enough, you can also access your home files and you have a secure connection to use on WiFi networks you may not trust.

After dabbling and following some fantastic tutorials, such as those by Lauren Orsini on ReadWrite.com and Sam Hobbs on Samhobbs.co.uk, I made a lot of progress and got to the point where I needed an application (ddclient) to push my IP to my DDNS provider, and I ran into a problem. Sure, it was working and my IP was being updated, but how could I be sure I was connecting over SSL? Was I transmitting my username and password in plain text across the internet? The answer was yes.

Here’s the format for a typical DDClient config file:

daemon=60
syslog=yes
mail=root
mail-failure=root
pid=/var/run/ddclient.pid
ssl=yes
use=web, web=myip.dnsdynamic.org
server=www.dnsdynamic.org
login=USERNAME
password=PASSWORD
server=www.dnsdynamic.org, \
protocol=dyndns2 \
YOUR DOMAIN GOES HERE

Looks fine enough right? Well, it’s not.

If you run this:

sudo ddclient -verbose -debug -noquiet -query

You will quickly see you are connecting over HTTP, not SSL. Even though ssl=yes is set and SHOULD be forcing HTTPS, we can clearly see the connection going out in plain text across the internet.

use=web, web=loopia address is IPADDRESSISHERE
CONNECT: myip.dnsdynamic.org
CONNECTED: using HTTP
SENDING: GET / HTTP/1.0
SENDING: Host: myip.dnsdynamic.org
SENDING: User-Agent: ddclient/3.8.2
SENDING: Connection: close

A lot of the time I solve problems by coming back to them with a fresh brain and a few cups of coffee. After getting reminded of this one, I figured out that adding just eight more characters meant we would no longer fly through the internet tubes in the clear; we would be secured.

I changed the line use=web, web=myip.dnsdynamic.org to use=web, web=https://myip.dnsdynamic.org.
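For reference, here are the relevant lines of the config after the change (everything else stays exactly as shown above; only the use=web line differs):

ssl=yes
use=web, web=https://myip.dnsdynamic.org
server=www.dnsdynamic.org

And here's the new output from the query: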

use=web, web=loopia address is IPADDRESSISHERE
CONNECT: myip.dnsdynamic.org

The verification of cert '/C=US/O=GeoTrust Inc./CN=RapidSSL SHA256 CA/CN=www.dnsdynamic.org'
failed against the host 'myip.dnsdynamic.org' with the default verification scheme.

THIS MIGHT BE A MAN-IN-THE-MIDDLE ATTACK !!!!

To stop this warning you might need to set SSL_verifycn_name to
the name of the host you expect in the certificate.

CONNECTED: using SSL
SENDING: GET / HTTP/1.0
SENDING: Host: myip.dnsdynamic.org
SENDING: User-Agent: ddclient/3.8.2
SENDING: Connection: close

Success! I’m not too concerned about the cert warning; if you look, it’s the same domain listed, and the output even tells you that setting SSL_verifycn_name to the host you expect will silence it. So the query above shows how you can verify you are connecting over SSL, and the https:// adjustment in the config file resolves the connection security.

What if I edit a View in use?

There are times we need to modify a view in production. We can use tools like sp_who2 and sp_whoisactive, or use the DMVs and Extended Events ourselves, to see who is using a view at the moment. But what happens when we need to modify the view while it’s currently in use?
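As a quick illustration, here is a rough DMV spot check for sessions whose currently executing batch references the view (dbo.MyView is a hypothetical name; this only catches statements running right now, so it is a spot check, not an audit):

SELECT r.session_id,
       r.status,
       r.command,
       t.text AS running_batch
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
WHERE t.text LIKE '%MyView%'
      AND r.session_id <> @@SPID;  -- exclude this query itself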

Modifying the underlying table should not have an impact on running ALTER VIEW, but the same cannot be said for users trying to access the view itself.

If a view currently used is modified by using ALTER VIEW, the Database Engine takes an exclusive schema lock on the view. When the lock is granted, and there are no active users of the view, the Database Engine deletes all copies of the view from the procedure cache. Existing plans referencing the view remain in the cache but are recompiled when invoked.

Source

Exclusive locks:
Used for data-modification operations, such as INSERT, UPDATE, or DELETE. Ensures that multiple updates cannot be made to the same resource at the same time.

Schema locks:

Used when an operation dependent on the schema of a table is executing. The types of schema locks are: schema modification (Sch-M) and schema stability (Sch-S).

What it does:
During the time that it is held, the Sch-M lock prevents concurrent access to the table. This means the Sch-M lock blocks all outside operations until the lock is released.

What happens:
User 1 runs a query.
User 2 alters the view.
User 3 tries to query the view.
User 1’s query finishes and User 3’s query finishes; once no active users remain on the view, User 2’s ALTER acquires its schema lock, blocking any other transactions until it completes.

I ran a test on my database with three concurrent windows: one selected from the view 10,000 times, another inserted into the view 10,000 times, and the third updated the base table 1,000 times. While these ran, I ran the ALTER VIEW script. It behaved just as described in the scenario above.
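A minimal sketch of that kind of test, using hypothetical object names (dbo.TestView over dbo.TestTable) rather than my actual test objects:

-- Session 1: hammer the view with reads
SET NOCOUNT ON;
DECLARE @i int = 0;
WHILE @i < 10000
BEGIN
    SELECT COUNT(*) FROM dbo.TestView;
    SET @i += 1;
END;

-- Session 2 (separate window): alter the view while session 1 is still looping
ALTER VIEW dbo.TestView
AS
SELECT id, color, number, animal
FROM dbo.TestTable;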

None of these workloads was overly heavy or time consuming; they were basically thousands of micro-transactions. The ALTER wedged itself in as quickly as it could and altered the view between transactions once it could establish the lock. Because the view and tables were small, I could not see a noticeable difference; your environment will vary based on your data sets.

I think the primary risk is increasing the time it takes for transactions against the view to run or complete. Another risk is that modifying a view can break other code that wasn’t written to specify columns and relies on * instead. You could also be removing columns from the view that are in use elsewhere.

Auditing your SQL Servers

Below is what I recommend my clients use to evaluate the security of their infrastructure quarterly, although more frequently is always better. Typically with this set of scripts, you’ll want to compile the data into workbooks, flat files, or at least historic tables so you can track trends and changes.

We begin with two queries: one brings back the SQL Server name and instance so we know what we are evaluating, and the other checks the version of SQL Server, which carries its own security repercussions depending on service pack and release year.

SELECT @@SERVERNAME AS 'SQL Server Name and Instance'
SELECT @@VERSION AS 'SQL Server Version'

The next script is a simple select of the SQL logins on the server. It’s a nice view of which logins exist, whether password policy is correctly enforced, whether passwords are expired, whether accounts are locked out, and other helpful tidbits. Some additional reading on the returned columns can be found here: https://msdn.microsoft.com/en-us/library/ms345412.aspx

SELECT  name ,
        Is_Policy_Checked ,
        Is_Expiration_Checked ,
        LOGINPROPERTY(name, 'IsMustChange') AS Is_Must_Change ,
        LOGINPROPERTY(name, 'IsLocked') AS [Account Locked] ,
        LOGINPROPERTY(name, 'LockoutTime') AS LockoutTime ,
        LOGINPROPERTY(name, 'PasswordLastSetTime') AS PasswordLastSetTime ,
        LOGINPROPERTY(name, 'IsExpired') AS IsExpired ,
        LOGINPROPERTY(name, 'BadPasswordCount') AS BadPasswordCount ,
        LOGINPROPERTY(name, 'BadPasswordTime') AS BadPasswordTime ,
        LOGINPROPERTY(name, 'HistoryLength') AS HistoryLength ,
        Modify_date
FROM    sys.sql_logins; 
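If you want these results as a point-in-time snapshot you can trend against, one approach is to persist each run into a history table. Here is a minimal sketch; dbo.LoginAuditHistory and its columns are hypothetical, so adjust them to whichever columns you actually track:

-- Hypothetical history table for login audit snapshots
CREATE TABLE dbo.LoginAuditHistory
(
    CapturedAt datetime2(0) NOT NULL DEFAULT (SYSUTCDATETIME()),
    ServerName sysname      NOT NULL,
    LoginName  sysname      NOT NULL,
    IsPolicyChecked     bit NULL,
    IsExpirationChecked bit NULL
);

-- Run this as part of each quarterly audit
INSERT INTO dbo.LoginAuditHistory (ServerName, LoginName, IsPolicyChecked, IsExpirationChecked)
SELECT @@SERVERNAME, name, is_policy_checked, is_expiration_checked
FROM sys.sql_logins;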

This next query gives us a high-level overview of what the logins have access to instance-wide. It’s a great check for who is and isn’t a sysadmin.

SELECT  Name ,
        DenyLogin ,
        HasAccess ,
        SysAdmin ,
        SecurityAdmin ,
        ServerAdmin ,
        SetupAdmin ,
        ProcessAdmin ,
        DiskAdmin ,
        DBCreator ,
        BulkAdmin ,
        UpdateDate
FROM    syslogins
ORDER BY name;

We then break this down into how they have those permissions. Is it a SQL or Windows account? What types of permissions are granted, and what are the datestamps on these logins? This script answers those questions; it’s a more granular take on the query we just looked at.

;WITH    ServerPermsAndRoles
          AS ( SELECT   spr.name AS login_name ,
                        spr.type_desc AS login_type ,
                        spm.permission_name COLLATE SQL_Latin1_General_CP1_CI_AS AS permission_name ,
                      --  CONVERT(NVARCHAR(1), spr.is_disabled) AS is_disabled ,
                        'permission' AS permission_type ,
                        spr.create_date AS create_date ,
                        spr.modify_date AS modify_date
               FROM     sys.server_principals spr
                        INNER JOIN sys.server_permissions spm ON spr.principal_id = spm.grantee_principal_id
               WHERE    spr.type IN ( 's', 'u' )
               UNION ALL
               SELECT   sp.name AS login_name ,
                        sp.type_desc AS login_type ,
                        spr.name AS permission_name ,
                       -- CONVERT(NVARCHAR(1), spr.is_disabled) AS is_disabled ,
                        'role membership' AS permission_type ,
                        spr.create_date AS create_date ,
                        spr.modify_date AS modify_date
               FROM     sys.server_principals sp
                        INNER JOIN sys.server_role_members srm ON sp.principal_id = srm.member_principal_id
                        INNER JOIN sys.server_principals spr ON srm.role_principal_id = spr.principal_id
               WHERE    sp.type IN ( 's', 'u' )
             ),
        MapDisabled
          AS ( SELECT
          DISTINCT      SP.NAME ,
                        SP.IS_DISABLED
               FROM     SYS.server_principals SP
                        FULL JOIN sys.server_role_members RM ON SP.PRINCIPAL_ID = RM.MEMBER_PRINCIPAL_ID
               WHERE    SP.type IN ( 's', 'u' )
             )
    SELECT  ServerPermsAndRoles.Login_Name ,
            ServerPermsAndRoles.Login_Type ,
            ServerPermsAndRoles.Permission_Name ,
            ServerPermsAndRoles.Permission_Type ,
            ServerPermsAndRoles.Create_Date ,
            ServerPermsAndRoles.Modify_Date ,
            MapDisabled.Is_Disabled
    FROM    ServerPermsAndRoles
            LEFT JOIN MapDisabled ON MapDisabled.name = ServerPermsAndRoles.login_name
    ORDER BY MapDisabled.is_disabled DESC ,
            ServerPermsAndRoles.login_name;

Next we see what those logins are mapped to: which permissions they hold and which databases are impacted. You can see how many people have X, Y, or Z access to your databases and what’s mapped where.

DECLARE @DBRolePermissions TABLE(
 DatabaseName varchar(300), 
 Principal_Name sysname, 
 Login_Name sysname NULL, 
 DB_RoleMember varchar(300), 
 Permission_Type sysname);

INSERT INTO @DBRolePermissions
EXEC sp_MSforeachdb '
 SELECT DISTINCT ''?'' AS DatabaseName, users.Name AS UserName, suser_sname(users.sid) AS Login_Name, 
 roles.Name AS Role_Member_Name, roles.type_desc
 FROM [?].sys.database_role_members r 
 LEFT OUTER JOIN [?].sys.database_principals users on r.member_principal_id = users.principal_id
 LEFT OUTER JOIN [?].sys.database_principals roles on r.role_principal_id = roles.principal_id
   --JOIN [?].sys.database_permissions Action on Action.grantee_principal_id = users.principal_id'

INSERT INTO @DBRolePermissions
EXEC sp_msforeachdb '
 SELECT DISTINCT ''?'' AS DatabaseName, users.Name AS UserName, suser_sname(users.sid) AS Login_Name, 
 r.Permission_Name AS DB_RoleMember, r.class_desc
 FROM [?].sys.database_permissions r 
 LEFT OUTER JOIN [?].sys.database_principals users on r.grantee_principal_id = users.principal_id
 WHERE r.class_desc = ''DATABASE'''

SELECT DISTINCT Principal_Name, Login_Name, DatabaseName, DB_RoleMember AS Permission_Name
FROM @DBRolePermissions 
WHERE  Permission_Type <> 'DATABASE'
ORDER BY Principal_Name, DatabaseName, DB_RoleMember;

After evaluating who has access to what, I like to check and verify which databases are encrypted. This is a quick check of what is and isn’t encrypted at rest, along with the key algorithm and strength configured.

USE master;
GO
SELECT db.name,
       db.is_encrypted,
       dm.encryption_state,
       dm.percent_complete,
       dm.key_algorithm,
       dm.key_length
FROM sys.databases db
     LEFT OUTER JOIN sys.dm_database_encryption_keys dm ON db.database_id = dm.database_id;

The last check I do looks at linked server and remote server permissions, to keep tabs on our other endpoints.

SELECT ss.server_id 
          ,ss.name 
          ,'Server ' = Case ss.Server_id 
                            when 0 then 'Current Server' 
                            else 'Remote Server' 
                            end 
          ,ss.product 
          ,ss.provider 
          ,ss.catalog 
          ,'Local Login ' = case sl.uses_self_credential 
                            when 1 then 'Uses Self Credentials' 
                            else ssp.name 
                            end 
           ,'Remote Login Name' = sl.remote_name 
           ,'RPC Out Enabled'    = case ss.is_rpc_out_enabled 
                                   when 1 then 'True' 
                                   else 'False' 
                                   end 
           ,'Data Access Enabled' = case ss.is_data_access_enabled 
                                    when 1 then 'True' 
                                    else 'False' 
                                    end 
           ,ss.modify_date 
      FROM sys.Servers ss 
 LEFT JOIN sys.linked_logins sl 
        ON ss.server_id = sl.server_id 
 LEFT JOIN sys.server_principals ssp 
        ON ssp.principal_id = sl.local_principal_id;

To bring it all together, it’s important to capture this data as a snapshot in time so you can compare it, or build automated reporting on top of it, and see how your environment is changing and who can access what. When you’re in charge of a SQL Server, you need to keep track of who can do what and ensure the proper folks have the proper permissions, so you avoid compliance issues and minimize user impact on the databases.

Who changed my data?

Odds are that if you are reading this, someone, somewhere, somehow has changed your data. Odds are also that you are out of luck figuring out what changed and who did it. The good news is that the lesson has been learned, and you can take action now to prevent this in the future.

SQL Server by default does not track who does what to your tables. It does track the changes made to your database, but it won’t tell you “who done it?”. You can, with extremely painstaking patience, use the transaction log to reveal what changes were made.

There are features such as Change Tracking and Change Data Capture that can potentially give you better insight into changes as they occur. Unfortunately, when it comes to who made the changes, they only record the principal account running CT/CDC and will not reveal the culprit.


There is real-time tracing with SQL Profiler, but it creates extreme load on the server and it does not reveal the past. You can instead use SQL Server’s Dynamic Management Views and Extended Events to capture SQL in real time, along with wait types, execution plans, and much more in-depth information, without putting as much load on the system. sp_whoisactive does exactly this (thank you, Adam!). I run it every few minutes to capture data and analyze my long-running queries and wait types, but more importantly for this topic, it captures who ran what query and when. I run other scripts that I will eventually load to GitHub and post for everyone, along with a more detailed look at how I performance tune my T-SQL.
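If you want to go the sp_whoisactive route, a minimal sketch of a periodic capture looks something like this (assuming sp_whoisactive is installed; dbo.WhoIsActiveLog is a hypothetical table you would create ahead of time, and the procedure can generate its schema for you via its @return_schema option):

-- Run every few minutes from a SQL Agent job to log current activity,
-- including the login and the text of what each session is running
EXEC dbo.sp_WhoIsActive
     @get_plans = 1,
     @destination_table = 'dbo.WhoIsActiveLog';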

The problem with real-time analysis is that you have to catch the change in real time. There’s always the chance the SQL ran too fast, or that sp_whoisactive wasn’t running at the same moment. This leads us to the only real solution: triggers.

I won’t dive into HOW to do this, because that would be reinventing the wheel, but I will leave this resource, which covers creating an audit trail to figure out what changed, by whom, and when.
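Just to give a flavor of the idea, here is a bare-bones sketch with hypothetical table names; it is not the linked resource’s solution, and real audit triggers usually capture the before and after values as well:

-- Hypothetical audit table: who touched dbo.Orders, when, and how
CREATE TABLE dbo.OrdersAudit
(
    AuditId    int IDENTITY(1,1) PRIMARY KEY,
    OrderId    int          NOT NULL,
    ChangeType varchar(10)  NOT NULL,
    ChangedBy  sysname      NOT NULL DEFAULT (SUSER_SNAME()),
    ChangedAt  datetime2(0) NOT NULL DEFAULT (SYSUTCDATETIME())
);
GO

CREATE TRIGGER trg_Orders_Audit
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Rows in inserted: classed as UPDATE when deleted also has rows, otherwise INSERT
    INSERT INTO dbo.OrdersAudit (OrderId, ChangeType)
    SELECT i.OrderId,
           CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'UPDATE' ELSE 'INSERT' END
    FROM inserted AS i;

    -- Rows only in deleted: a DELETE
    INSERT INTO dbo.OrdersAudit (OrderId, ChangeType)
    SELECT d.OrderId, 'DELETE'
    FROM deleted AS d
    WHERE NOT EXISTS (SELECT 1 FROM inserted);
END;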