Takes as Long as it Takes SQL, Break on Me, I have a DAC…

The DAC, what is it? It is the Dedicated Administrator Connection. Basically, it is a way to connect to your SQL Server when all the other connections are tied up, but it takes a little bit of pre-planning so that you can use it when things go bad with your SQL Server. Let's enable it now so you can test it and know it is there in the future.


-- By default the DAC only accepts local connections; this allows it to be used remotely as well
EXEC sp_configure 'remote admin connections', 1;
GO
RECONFIGURE;
GO
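
If you want to double-check that the change took, a quick look at sys.configurations will show it (value_in_use should come back as 1):

-- Confirm the DAC will accept remote connections
SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'remote admin connections';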

I was always afraid to use the DAC because I thought I had to use the command line, and I am terrible with cmd. But did you know you can connect to it through Management Studio? Now I can feel right at home when I troubleshoot a sick server. To connect to a server using the DAC, put “admin:” in front of the server name, like this:

(Screenshot: DAC connection in SSMS using admin:YourServerName)

I use my elevated Windows login for this, but you can use whatever sysadmin account you have.

There are a few words of caution. You can only have one DAC connection to each server at a time, so don't try to connect Object Explorer to it. I have SQL Prompt and noticed that my server would send a Severity 20 alert because SQL Prompt was trying to connect at the same time and failing. Just be aware that you might get an error, but if everything is correct you can look at the connection shown at the bottom of the query window and see that you are connected. If someone else is using the DAC, you won't be able to connect, so it is a good idea to always close it when you are done.

Kendra Little (b|t) has a fantastic post for Brent Ozar Unlimited about the DAC and, most importantly to me, how to tell who is connected. This is the code she wrote to help identify who is connected:


SELECT
    CASE
        WHEN ses.session_id = @@SPID THEN 'It''s me! '
        ELSE ''
    END + COALESCE(ses.login_name, '???') AS WhosGotTheDAC,
    ses.session_id,
    ses.login_time,
    ses.status,
    ses.original_login_name
FROM sys.endpoints AS en
JOIN sys.dm_exec_sessions AS ses ON en.endpoint_id = ses.endpoint_id
WHERE en.name = 'Dedicated Admin Connection';

Now when SQL Server breaks on you, there is a tool that keeps us from shattering like glass.

Hey what’s that thing I can’t remember…determining RAM amounts

I am frequently trying to determine how much RAM I can give to SQL Server without starving the OS. Since my servers usually only have SQL Server on them and no other applications, I can give SQL Server everything except 2-4 GB for the OS. Depending on what that server is doing, and whether it still looks hungry, I will give the OS or SQL Server more. The question is, if I can't remote onto the server, how do I know how much RAM it has? After much searching and calculating, I have come up with a T-SQL query to help. It will tell you how much physical memory the server has, in GB.

-- physical_memory_kb (SQL Server 2012 and later) is reported in KB;
-- multiplying by 1/1048576 (9.5367431640625E-7) converts it to GB
SELECT physical_memory_kb * 9.5367431640625E-7 AS physical_memory_gb
FROM sys.dm_os_sys_info;

It uses the system DMVs, which I have barely begun to scratch the surface of, but I plan to learn more.
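
Once you know the total, capping SQL Server is just an sp_configure call. This is only a sketch with made-up numbers; the 28672 MB below assumes a hypothetical 32 GB server where I leave 4 GB for the OS, so do your own math first:

-- Example numbers only: cap SQL Server at 28 GB on a hypothetical 32 GB server
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
GO
EXEC sp_configure 'max server memory (MB)', 28672;
RECONFIGURE;
GO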

I got 99 problems but Autogrowth ain’t one because of CMS

I recently took over a new environment and have had my hands full exploring and setting up all my checks. When I attended my local user group a few weeks ago, I realized I was doing it the hard way: I was connecting to each server one at a time to run my install T-SQL scripts and checks. Mike Tutor gave a fantastic presentation on Central Management Servers (CMS) in SQL Server and how to get started using them. Today I had another issue with a database log file that had autogrown to an unruly size. I started to do the math and realized that if I didn't learn how to use CMS, I would be fixing autogrowth settings all week. So let's begin. This article assumes that you have already registered your instances into groups within CMS. Right-click on your group and select New Query, then run this query to see where you are at:

-- size, max_size, and growth are in 8 KB pages (growth is a percentage when is_percent_growth = 1)
SELECT sd.name AS DBname,
       smf.name AS Filename,
       smf.size,
       smf.max_size,
       smf.growth,
       smf.is_percent_growth
FROM sys.master_files smf
INNER JOIN sys.databases sd ON smf.database_id = sd.database_id;

This will give you both the server name and the database name for every file when you run it against a CMS group, but it will also work if you are just on one local server.
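
If you only want to see the likely offenders, a filtered variation of the same query works too. The cutoffs below are just my assumption that percent growth, or a fixed increment under 64 MB, is what needs fixing:

-- Files using percent growth, or a fixed growth increment under 64 MB (8192 x 8 KB pages)
SELECT sd.name AS DBname, smf.name AS Filename, smf.growth, smf.is_percent_growth
FROM sys.master_files smf
INNER JOIN sys.databases sd ON smf.database_id = sd.database_id
WHERE smf.is_percent_growth = 1
   OR (smf.growth > 0 AND smf.growth < 8192);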

Next, I am going to use Policy-Based Management to apply a default growth value across all my databases. On a test server, you want to create a new condition:
(Screenshot: CreateCondition)

Whatever you set that @Growth value to is the value that will be applied to every file you run this policy against, so please use caution and plan a value that will fit your growth needs.

Then pull it into a policy:
(Screenshot: CreatePolicy)

Right-click on the policy and select Evaluate to make sure it will do what you expect. The green ones already match what your policy would set; the red ones are the ones the policy would update. Under Details, select View to see what the values are now. You can test how it is going to work by checking a checkbox or two and selecting Apply.

Export the policy by right-clicking on it and selecting “Export Policy”.

Then go back to your CMS group, right-click, and select Import Policy. Right-click on the group again and select Evaluate Policies, find the one you just imported, check its box, and run it. This works the same as before: select the checkboxes of the ones you want to update and click Apply.

Next, I want to control the rate each system database grows. This is just an example, so please plan your growth and modify as needed. Right-click on your CMS group, select New Query, paste this in, and adjust it for your environment. (You could also do this with a more specialized policy, but I wanted to use both approaches.)

ALTER DATABASE [master] MODIFY FILE (NAME = N'master',   FILEGROWTH = 240MB);
GO
ALTER DATABASE [master] MODIFY FILE (NAME = N'mastlog',  FILEGROWTH = 160MB);
GO

ALTER DATABASE [msdb]   MODIFY FILE (NAME = N'MSDBData', FILEGROWTH = 240MB);
GO
ALTER DATABASE [msdb]   MODIFY FILE (NAME = N'MSDBLog',  FILEGROWTH = 160MB);
GO

ALTER DATABASE [model]  MODIFY FILE (NAME = N'modeldev', FILEGROWTH = 240MB);
GO
ALTER DATABASE [model]  MODIFY FILE (NAME = N'modellog', FILEGROWTH = 160MB);
GO

ALTER DATABASE [tempdb] MODIFY FILE (NAME = N'tempdev',  FILEGROWTH = 240MB);
GO
ALTER DATABASE [tempdb] MODIFY FILE (NAME = N'templog',  FILEGROWTH = 160MB);
GO
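
For user databases I don't want to type one ALTER DATABASE per file, so a generator query can write them for me. This is only a sketch with a made-up 256MB increment; it prints the statements rather than running them, so review the output before you execute anything:

-- Generate (not execute) MODIFY FILE statements for every file still set to percent growth
SELECT 'ALTER DATABASE ' + QUOTENAME(sd.name)
     + ' MODIFY FILE (NAME = N''' + smf.name + ''', FILEGROWTH = 256MB);'
FROM sys.master_files smf
INNER JOIN sys.databases sd ON smf.database_id = sd.database_id
WHERE smf.is_percent_growth = 1;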

Many Chocolate Covered Gummy Bears gave their lives to bring you this information.

Transparent Data Encryption (TDE) in SQL Server

Ben Miller deserves the credit for this post. I have picked up bits and pieces about TDE here and there, but he pulled it all together for me. I presented this at work to give a broad overview of what TDE is and why we would use it. I will be writing a more in-depth post later.

What is it?

  • Encrypts your data at rest.
  • Uses AES (128, 192, or 256 bit) or 3DES.
  • Encryption is performed at the page level.
  • Data files, log files, and tempdb are encrypted (the query after this list shows which databases on an instance have encryption keys).
  • Tempdb is encrypted with AES-256 and you can't change that.
  • FILESTREAM data is not encrypted, even when TDE is enabled.
  • Protects against people stealing your files.
  • SELECT statement results are not encrypted, so it is transparent to the user.
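
If you want to see which databases on an instance are actually encrypted (this query is my addition, not from Ben's post), sys.dm_database_encryption_keys will tell you, including tempdb once any user database is encrypted:

-- encryption_state: 2 = encryption in progress, 3 = encrypted
SELECT DB_NAME(database_id) AS database_name,
       encryption_state,
       key_algorithm,
       key_length
FROM sys.dm_database_encryption_keys;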

Benefits

  • No schema changes, unlike cell-level encryption.
  • Page-level encryption.
  • Microsoft estimates the performance degradation at 3 to 5%, instead of the 20 to 28% that occurs with cell-level encryption.
  • Backups are secure by default.
  • Invisible to the user.

Disadvantages

  • Backup compression is not useful when TDE is enabled.
  • Enterprise Edition only.
  • Cell-level encryption gives you finer control over which elements are encrypted.
  • Tempdb is encrypted even if only one database on the instance is encrypted.
  • Instant file initialization is not available when TDE is enabled.
  • If you lose your certificate, your data is gone (the sketch below backs up the certificate for exactly that reason).
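
As a teaser for that more in-depth post, here is a rough sketch of the moving parts. The database name, certificate name, passwords, and file paths are placeholders I made up, not anything from Ben's material:

-- 1. Create a master key and certificate in master (placeholder names and passwords)
USE master;
GO
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<UseAStrongPasswordHere>';
GO
CREATE CERTIFICATE TDEDemoCert WITH SUBJECT = 'TDE demo certificate';
GO

-- 2. Back up the certificate and private key immediately; without them,
--    backups of the encrypted database cannot be restored anywhere else
BACKUP CERTIFICATE TDEDemoCert
    TO FILE = 'C:\KeyBackups\TDEDemoCert.cer'
    WITH PRIVATE KEY (FILE = 'C:\KeyBackups\TDEDemoCert.pvk',
                      ENCRYPTION BY PASSWORD = '<AnotherStrongPassword>');
GO

-- 3. Create the database encryption key and turn encryption on
USE MyUserDatabase;
GO
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDEDemoCert;
GO
ALTER DATABASE MyUserDatabase SET ENCRYPTION ON;
GO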

Happy Encrypting!

Andrea