Monthly Archives: July 2010

Reads and Writes per DB, Using DMVs

Years ago, when I was working with a SQL 2000 database, I needed to see how many reads and writes were happening on each DB so I could properly partition a new SAN we had purchased. The SAN guys wanted Reads/sec, Writes/sec, and total IOs/sec. Perfmon couldn’t give this per DB, so I had to use fn_virtualfilestats. I wrote a procedure that would tell me what was going on per DB and then store the results in a table for later comparison.
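For reference, here is a minimal sketch of that SQL 2000 approach, not the original procedure itself; the database name and file ID below are placeholders, and the column names are the ones fn_virtualfilestats returns.

-- Legacy SQL 2000 sketch: fn_virtualfilestats returns cumulative counters
-- (since the last restart) for the database and file you pass in.
DECLARE @DbId INT, @FileId INT
SET @DbId = DB_ID('MyDatabase')   -- placeholder database name
SET @FileId = 1                   -- placeholder file ID

SELECT  DB_NAME(DbId)  AS DatabaseName,
        FileId,
        NumberReads,
        NumberWrites,
        BytesRead,
        BytesWritten
FROM    ::fn_virtualfilestats(@DbId, @FileId)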

I’ve found a need for this again since I’m running some load tests and want to know what my data files are doing. This is easier now thanks to sys.dm_io_virtual_file_stats. It’s even better now that so many people in the community provide great content! Glenn Alan Berry (Blog | Twitter) wrote an excellent “DMV a Day” blog post series, and in one of those posts he gives a great query on virtual file stats. That gives you an excellent general point-in-time look at your server, but it was not exactly what I needed: I wanted to know what my reads and writes were for a specific period. That works better for my load testing, where I need to know just what is going on during a specific window.
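If all you need is that cumulative, since-restart picture, a simple rollup of the DMV will do. This is just a sketch of that kind of point-in-time query, not the one from the DMV-a-Day series; it sums the per-file counters up to the database level and shows total I/O in MB.

-- Cumulative I/O per database since the last restart (point-in-time look).
SELECT  DB_NAME(fs.database_id)                                        AS DatabaseName,
        SUM(fs.num_of_reads)                                           AS TotalReads,
        SUM(fs.num_of_writes)                                          AS TotalWrites,
        SUM(fs.num_of_bytes_read + fs.num_of_bytes_written) / 1048576  AS TotalIO_MB
FROM    sys.dm_io_virtual_file_stats(NULL, NULL) AS fs
GROUP BY fs.database_id
ORDER BY TotalIO_MB DESC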

You might want to use this during heavy business hours as well. The DMV keeps counting from the last server restart, so if you look at just the DMV you’re going to see everything that has occurred since then. Maintenance items (CHECKDB, index rebuilds, backups) all show up in there, so a very large DB whose backup does lots of reads at night can end up with a much higher read percentage even though no one really uses it during the day. The DMV might show you that the DB is very busy when in reality your other DBs are the busy ones during business hours. So if I’m planning for a new SAN or moving files around, I would run this query at specific times so I could compare data points and find out what my databases are actually doing during critical times.

This script takes a baseline record from the DMV, waits the amount of time you specify, and then takes a comparison record. It compares the two and returns the percentages. Lots more could be done with the comparison, and I’ve left certain things in for later changes, like Size; right now I don’t do anything with Size, but I plan to show growth between the data points. I chose not to create the script as a stored procedure just so it’s easy for you to put wherever you want. It only deals with two data points to keep it simple for now; I have considered changing that in the future. The only setting you need to change is the @DelaySeconds parameter: just set it to the number of seconds you want it to wait. 60 seconds is usually a good default for a quick snapshot of the server. To discover what your databases are doing during busy times, I would run it for about 300 seconds (5 minutes) and then repeat that two or three times in an hour to see what the data looks like.

Would love to hear any comments on whether this works for you! Thanks.

/*********************************************************************************************
File Stats Per DB
Date Created: 07-22-2010
Written by Pat Wright(@SqlAsylum)
Email: SqlAsylum@gmail.com
Blog: http://Www.Sqlasylum.com
This script is free to download for personal, educational, and internal corporate purposes,
provided that this main header is kept along with the script. Sale of this script, in whole
or in part, is prohibited without the author’s consent.
*********************************************************************************************/

CREATE TABLE #FileStatsPerDb
(
ID INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
[databaseName] [NVARCHAR](128) NULL,
[FileType] [NVARCHAR](60) NULL,
[physical_name] [NVARCHAR](260) NOT NULL,
[DriveLetter] VARCHAR(5) NULL,
[READS] [BIGINT] NOT NULL,
[BytesRead] [BIGINT] NOT NULL,
[Writes] [BIGINT] NOT NULL,
[BytesWritten] [BIGINT] NOT NULL,
[SIZE] [BIGINT] NOT NULL,
[InsertDate] [DATETIME] NOT NULL DEFAULT GETDATE()
)
ON [PRIMARY]

DECLARE @Counter TINYINT
DECLARE @DelaySeconds INT
DECLARE @TestTime DATETIME

-- Set parameters
/*
@Counter just initializes the loop counter to 1.
@DelaySeconds tells SQL Server how long to wait before it takes the second data point.
How long you want this depends on your needs. If I have a load test running for
5 minutes and I want to know what the read and write percents were during those 5 minutes,
I set it to 300. If I just want a quick look at the system I'll usually set it to 60 seconds,
to give me a one-minute view. It depends on whether it's a busy time and what's going on during that time.
*/
SET @Counter = 1
SET @DelaySeconds = 60
SET @TestTime = DATEADD(SS, @DelaySeconds, GETDATE())

WHILE @Counter <= 2
BEGIN
    -- Snapshot the cumulative file stats for every data and log file
    INSERT INTO #FileStatsPerDb
        (DatabaseName, FileType, Physical_Name, DriveLetter, READS, BytesRead, Writes, BytesWritten, SIZE)
    SELECT
        DB_NAME(mf.database_id)      AS DatabaseName
        ,mf.type_desc                AS FileType
        ,mf.physical_name            AS Physical_Name
        ,LEFT(mf.physical_name, 1)   AS DriveLetter
        ,fs.num_of_reads             AS READS
        ,fs.num_of_bytes_read        AS BytesRead
        ,fs.num_of_writes            AS Writes
        ,fs.num_of_bytes_written     AS BytesWritten
        ,fs.size_on_disk_bytes       AS SIZE
    FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS fs
    JOIN sys.master_files AS mf
        ON mf.database_id = fs.database_id
        AND mf.file_id = fs.file_id

    -- After the baseline snapshot, wait until the comparison time
    IF @Counter = 1
    BEGIN
        WAITFOR TIME @TestTime
    END

    SET @Counter = @Counter + 1
END

;
WITH FileStatCTE (DatabaseName, FileType, DriveLetter, TotalReads, TotalWrites, TotalSize, TotalBytesRead, TotalBytesWritten)
AS
(
    -- Delta between the comparison (CP) and baseline (BL) snapshots.
    -- Joining on Physical_Name as well keeps multi-file databases from being double-counted.
    SELECT BL.DatabaseName, BL.FileType, BL.DriveLetter,
        NULLIF(SUM(CP.READS - BL.READS), 0)                       AS TotalReads,
        NULLIF(SUM(CP.Writes - BL.Writes), 0)                     AS TotalWrites,
        NULLIF(SUM(CP.SIZE - BL.SIZE) / 1024, 0)                  AS TotalSize,
        NULLIF(SUM(CP.BytesRead - BL.BytesRead) / 1024, 0)        AS TotalKiloBytesRead,
        NULLIF(SUM(CP.BytesWritten - BL.BytesWritten) / 1024, 0)  AS TotalKiloBytesWritten
    FROM
        ( SELECT InsertDate, DatabaseName, FileType, Physical_Name, DriveLetter, READS, BytesRead, Writes, BytesWritten, SIZE
          FROM #FileStatsPerDb
          WHERE InsertDate IN (SELECT MIN(InsertDate) FROM #FileStatsPerDb) ) AS BL -- Baseline
    JOIN
        ( SELECT InsertDate, DatabaseName, FileType, Physical_Name, DriveLetter, READS, BytesRead, Writes, BytesWritten, SIZE
          FROM #FileStatsPerDb
          WHERE InsertDate IN (SELECT MAX(InsertDate) FROM #FileStatsPerDb) ) AS CP -- Comparison
        ON  BL.DatabaseName = CP.DatabaseName
        AND BL.FileType = CP.FileType
        AND BL.DriveLetter = CP.DriveLetter
        AND BL.Physical_Name = CP.Physical_Name
    GROUP BY BL.DatabaseName, BL.FileType, BL.DriveLetter
)

/*
Return the read and write percent for each DB, file type, and drive letter. Order by ReadPercent.
*/
SELECT DatabaseName, FileType, DriveLetter,
    100. * TotalReads  / SUM(TotalReads)  OVER() AS ReadPercent,
    100. * TotalWrites / SUM(TotalWrites) OVER() AS WritePercent
FROM FileStatCTE
ORDER BY ReadPercent DESC, WritePercent DESC
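If you want to keep the data points around after the session ends (like the table my old SQL 2000 procedure wrote to), one option is to copy each run into a permanent table before the temp table goes away. This is only a sketch; dbo.FileStatsHistory is a made-up name for the example, not part of the script above.

-- Optional: persist the snapshots for later comparison.
-- dbo.FileStatsHistory is an example name, not part of the script above.
IF OBJECT_ID('dbo.FileStatsHistory', 'U') IS NULL
BEGIN
    -- First run: create the history table from the temp table's shape and data
    SELECT * INTO dbo.FileStatsHistory FROM #FileStatsPerDb
END
ELSE
BEGIN
    -- Later runs: append this run's baseline and comparison rows
    INSERT INTO dbo.FileStatsHistory
        (databaseName, FileType, physical_name, DriveLetter,
         READS, BytesRead, Writes, BytesWritten, SIZE, InsertDate)
    SELECT  databaseName, FileType, physical_name, DriveLetter,
            READS, BytesRead, Writes, BytesWritten, SIZE, InsertDate
    FROM    #FileStatsPerDb
END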

In my haste I forgot to add a sample of what you would get from this. The names of the DBs have been changed.

[Sample output: ReadPercent and WritePercent per database, ordered by ReadPercent]

Should you Run for the Board?

I wanted to get this blog post out today to talk about the PASS Board. Hopefully, if you are reading this and thinking about running, it will help you choose to run.

First, there have already been some really good blog posts on running for the Board. Here are a few that I really liked.

Andy Warren

Joe Webb

Jeremiah Peschka

I spent two years as a Board member. It was very rewarding, and here are some of the things I learned and some of the reasons I came up with to join the ranks.

1. How to deal with budgets. When I started on the Board I was a DBA with little to no business experience. I didn’t know that we had to manage millions of dollars; I had always just sent in an expense report to someone and expected to get paid. This was a great eye-opener for me and really made me a better database person down the road, once I understood financials better.

2. How to deal with others and compromise. Most DBAs are pretty good at this (well, not the compromise part), since we have to deal with not seeing eye to eye with developers all the time. But what about when you don’t agree with your fellow DBA? The Board taught me to work well with my peers and to compromise at certain times.

3. How to manage people. I have been quoted on many occasions as saying I’m not a very good manager. It’s something I’ve worked on for much of my career, but I still feel I have a lot of room to grow, and it’s something the Board helped me with. You have to manage volunteers who “work” for you, and how effectively you do this matters greatly to the community at large.

4. Learning to take criticism. I would be lying if I said it’s all bacon and set-based queries on the PASS Board. The SQL community is vibrant and vocal with suggestions. In my opinion that makes the Board better, and that helps make the community better. If you’re not vocal in our community right now, you should be! You need to tell PASS what they are doing wrong and right; the more feedback we give, the better things will be. I learned a lot on the Board about taking criticism and how to turn it into something positive.

5. Last but not least: working with people outside our community. Did you know there are people out there who have never heard of PASS?! I know, it’s a crime, right? :) I learned that you can meet so many people and really talk to them not only about the PASS organization but about the SQL community in general, and tell them how great it is to be part of it! I love explaining to people what kind of community we have and how we help each other. Being an ambassador for PASS while I was on the Board was one of the most rewarding things I did.

Hopefully these things I’ve learned will help you in your decision to run for the Board. I apologize if I was a little long-winded; the community and the PASS organization mean a lot to me, and hopefully they mean enough to you as well to put in the time and run for the Board. 🙂

Here’s where you can put your name in the running for the Board:

 

http://elections.sqlpass.org/