Category Archives: DBA

DBA Job Hunting Tips Part 2

I wanted to follow up on this post: DBA Job Hunting Tips.  I received a resume from one candidate and really liked how they organized the information about themselves.  It was in line with many of the things from that first blog post.  The candidate didn't want to share the resume, so I simply made a new header for my own resume and will use that.  Below is an example of a really simple, clean header that gives me all the information I need to look this person up online and saves me a lot of time.

As you can see, I listed my blog, Twitter feed, and LinkedIn profile.  This gives the person looking at the resume easy access to all the work-related information about me online.  If you speak often, a link to your SpeakerRate profile is a good idea as well.  You should always go out of your way to make it easy for a hiring manager to find information about you.  That person may be going through hundreds of resumes, and saving the hiring manager time is a good idea.

Since I've been doing interviews all week, here are a few tips as well.

  1. Be passionate!  Make it sound like you really want to work here.  You are applying for the job, so I'm assuming you really do want it.  This is a very important point for me.
  2. You will not have all the answers.  Let me make that 100% clear: you will *NOT* have all the answers.  Don't be afraid to say "I don't know, but here are the resources I have at my disposal" (Google, Twitter, blogs, friends, etc.) and describe how you would find the answer.
  3. Don't be afraid to be technical.  You may be interviewing with an HR person who doesn't know SQL from Perl.  Don't be afraid to ask how technical they want you to get, and give direct, specific answers when a technical answer is wanted.
  4. Be early.  Notice I didn't say on time.  Again, if you are excited and really want to work here, then you need to show me that.
  5. Practice!  Hold mock interviews with your friends.  Present at a user group to get used to talking in front of people.  If you ever get the chance to conduct an interview yourself, take it; it gives you another perspective.
  6. Keep a clean appearance.  If this is a phone interview you can ignore this, but sooner or later you'll need to do an in-person, face-to-face interview, and when that happens, dress appropriately.  I'm not a big fan of a suit and tie for interviews, typically because I'm interviewing developers/DBAs and we don't normally wear that every day in our work environment.  Wear something you will be comfortable in but that still looks professional.  Bonus points if you wear a really geeky shirt, like one from www.thinkgeek.com.

The key item on this list for me is #1.  When someone does not know the answer to a question, I see that as a teaching opportunity for a candidate who really wants to learn.  If the person is not passionate, then I assume they don't want to learn and don't want to move forward in the career they have chosen.

These tips are very subjective, as everyone interviews and hires differently; this is what works for me and what I look for.  But no matter who you are talking to, if you are passionate about the subject it will show.

DBA Job Hunting Tips

Wow 4 months without a blog post.  I’ve been a lazy blogger.   🙂

In reality this seems to happen to me when I start a new job.  I get so focused on new projects that I lose a lot of time for writing blog posts or participating in social networks.  I've taken on a few new contracts and volunteer commitments as well, so it's been a busy 4 months for me.  I'm going to try to get into the habit of writing one blog post a week.  I have 2-3 already written/drafted, so those should get scheduled in the next few weeks.

At my last job I was responsible for hiring a new DBA for the team.  I'm in the same position at my current company and am looking for a DBA.  I've gotten a few resumes already and just wanted to share what I typically go through to review candidates.  Now keep in mind, this is before I have talked to them, emailed them, or scheduled some sort of interview.

  1. Google the candidate's email address.  I look at the results returned, but I also check the discussion results to find which Google Groups they might have been asking questions in and what kind of questions they asked.
  2. I search LinkedIn by the person's name.  Google can find this as well; I like to put a location behind the name, like SLC, so it finds the local person.  That is, of course, if the person is local.
  3. I review what I find from Google and LinkedIn.  If they are on LinkedIn, I find out who else in my network I can possibly ask for a review, and I go through the information on their profile and the recommendations they have gotten from other LinkedIn members.
  4. If they have a blog or Twitter account and I can find it from the email address or name that was given, then I review that as well.

These are the typical steps I do before even calling/speaking with a candidate.  This doesn’t mean that I’m not going to interview someone just because of what I find.  It just provides me with more information about the candidate.

So if you are out there looking in the market for a DBA position (like this one: http://www.allegiance.com/company/career-opportunities#da), I would suggest placing this information clearly on your resume so that it's easy to find.  It will at least make my life easier, and I'm sure others will appreciate it as well.

DBA Week 2: A follow-up to your first day

I wanted to get out a follow-up to my recommendations in the first-day-on-the-job post for DBAs.  After a very busy week, here is where I stand on my list and what I should add for the next week.

  1. Get a list of Servers I'll be managing/owning: I've got the list; I can't say I have access to all of them yet, but I'm working through that.
  2. Run the SP_Blitz: I've got this done on several servers, and honestly I haven't done a ton of analysis on it yet.  Still high on the list.
  3.  Get a DBA database created on each server: Done for known servers.
  4. Create the server side trace: Done for the primary prod servers.
  5. Create perfmon counters: Done for the primary prod servers.
  6. Install SSMS Tools Pack: Done (and it's already saved me).
  7. Check the backups/backup schedule: Taken care of by a managed hosting provider; many changes in store for this one.
  8. Check the security model and who has access to the DB: Figured out most of this, and some changes are planned for the future as well.

Things to Add after your first week.

  1. Developer communication/training.  We have developer training planned this week, and I'm prepping a presentation based on performance tuning and some of the procedures I've worked on over the last week.  It's a great time to start building relationships with the DEV team and to work with them to improve the app and server even more.
  2. Slow procedures/query tuning.  Now that you have your trace and performance counters in place, start researching what you can do to improve the performance of the server.  Perhaps hardware is needed, or memory settings need to be changed.  We saw a big performance gain after correcting a server misconfiguration in the memory settings.
  3. DMVs.  I want to get these into the first-week list but didn't have a chance, and I don't have an automated procedure yet; I'm researching some right now and plan to have them running in the next week.  Capturing query info and index info from the DMVs is key (see the sketch after this list for the kind of thing I mean).
  4. Optimizations.  Unfortunately there is still a lot of misinformation out there about what needs to be reindexed/rebuilt and checked.  I've found a very heavy optimization job running on the servers and need to refactor it to really help the performance of the server.  This is top of my list, as it's running into prod time and needs to get fixed.  I've also found shrink jobs running (never a good thing).
  5. Last major suggestion for this week: be cautious.  You'll find lots of things that you want to change, and you may be tempted to start making wide, sweeping changes, but be careful.  If your systems haven't been looked at by DBA eyes for a while, then large sweeping changes could have a huge impact on the system.  Make sure to document and test anything you put into place.  One method I use for this is getting a mailbox set up that I can email my changes to and store them in; that way I have a history.  If you have a team of DBAs with you, this works well to inform all of them at once.
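
Since I brought up DMVs in item 3, here is a minimal sketch of the kind of snapshot queries I mean.  To be clear, this is not the automated procedure I'm building, just two well-known DMV queries (query stats and missing-index suggestions) to show the shape of the data worth capturing:

-- Top queries by logical reads since the last restart (sys.dm_exec_query_stats).
SELECT TOP (20)
    qs.execution_count,
    qs.total_logical_reads,
    qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
    qs.total_worker_time / 1000 AS total_cpu_ms,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset WHEN -1 THEN DATALENGTH(st.text)
          ELSE qs.statement_end_offset END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;

-- Missing-index suggestions; treat these as hints to investigate, not gospel.
SELECT TOP (20)
    DB_NAME(mid.database_id) AS database_name,
    mid.statement AS table_name,
    migs.user_seeks,
    migs.avg_user_impact,
    mid.equality_columns,
    mid.inequality_columns,
    mid.included_columns
FROM sys.dm_db_missing_index_details AS mid
JOIN sys.dm_db_missing_index_groups AS mig ON mig.index_handle = mid.index_handle
JOIN sys.dm_db_missing_index_group_stats AS migs ON migs.group_handle = mig.index_group_handle
ORDER BY migs.user_seeks * migs.avg_user_impact DESC;

Persisting output like this into the DBA database on a schedule gives you the history the DMVs themselves lose at every restart.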

Hopefully I'll get a chance to post again mid next week with some of the scripts I've been talking about.  Let's see how week 2 goes for me. 🙂

First day on the job as a DBA

So I'm starting a new job today and figured I would write up a checklist of things I need to complete to take ownership of the databases I'll be managing.

  1. Get a list of servers I'll be managing/owning.  This can be done with a script or just by the company letting you know which ones.  It's not a bad idea to use a script to find other servers that might be out there that the company was not aware of.  Pinal Dave has a good example of this on his blog.
  2. Run the SP_Blitz script from Brent Ozar (blog | Twitter).  It's full of all sorts of good tidbits on what you need to look at for a server.  Check out the blog post in the link above and it will fill you in on the many great things this script can do for you.
  3. Get a DBA database created on each server.  I typically use this to house all sorts of maintenance, monitoring, and alerting pieces.  Every DBA should have a DB to call their own.  (There's a rough sketch of this and the trace from item 4 below the list.)
  4. Create the server-side trace.  Years ago, back in SQL 2000, I wrote a server-side Profiler trace that would run all the time and store its contents to a table that I could later review to watch the performance of queries hitting the DB.  Over the years, fellow DBAs have made updates to the script and made it better, and it's one of those key things I still depend on now.  I should get a future blog post out on this one.
  5. Create perfmon counters.  I store perfmon counters down to a DB through an ODBC connection in perfmon.  This way I can use a custom set of reports I've created to trend my server/DB performance.  This also gets me started on my benchmark, so I know where things stood when I began.  I should have some future blogs on this.
  6. Install SSMS Tools Pack.  No one should go without this tool.  It has saved me more times than I can count; I would suggest it to everyone.
  7. Check the backups/backup schedule.  The Blitz script is going to tell me some of this, but I want to make a point of checking with the business as to what the backup schedule should be and making sure that matches what is really happening.
  8. Check the security model and who has access to the DB.  Again, the Blitz script might tell me some of this, but I'll make it a point to ensure that a group like Domain Admins doesn't have full rights to the DB while there are 100 people in Domain Admins (yes, I've seen this before).
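
To make items 3 and 4 a little more concrete, here is a rough sketch, not my actual script (that one is coming in a future post).  The database name, table, and file path below are placeholders, and my real trace stores its results to a table for review; this version just writes batch/RPC completions to a rollover file you can read back later with fn_trace_gettable.

-- A DBA database plus a simple change-log table (names are placeholders).
CREATE DATABASE DBA;
GO
USE DBA;
GO
CREATE TABLE dbo.ChangeLog
(
    ChangeLogID INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    ChangeDate  DATETIME NOT NULL DEFAULT GETDATE(),
    ChangedBy   SYSNAME NOT NULL DEFAULT SUSER_SNAME(),
    Description VARCHAR(4000) NOT NULL
);
GO
-- Minimal server-side trace: batch and RPC completions with TextData, Duration,
-- Reads, Writes, and CPU, written to a rollover file (path is a placeholder and
-- must be writable by the SQL Server service; .trc is appended automatically).
DECLARE @TraceID INT, @MaxFileSizeMB BIGINT, @On BIT;
SET @MaxFileSizeMB = 100;
SET @On = 1;
EXEC sp_trace_create @TraceID OUTPUT, 2 /* TRACE_FILE_ROLLOVER */, N'C:\Traces\DBA_Perf', @MaxFileSizeMB;

-- Event 12 = SQL:BatchCompleted, 10 = RPC:Completed.
-- Column 1 = TextData, 13 = Duration, 16 = Reads, 17 = Writes, 18 = CPU.
EXEC sp_trace_setevent @TraceID, 12, 1, @On;
EXEC sp_trace_setevent @TraceID, 12, 13, @On;
EXEC sp_trace_setevent @TraceID, 12, 16, @On;
EXEC sp_trace_setevent @TraceID, 12, 17, @On;
EXEC sp_trace_setevent @TraceID, 12, 18, @On;
EXEC sp_trace_setevent @TraceID, 10, 1, @On;
EXEC sp_trace_setevent @TraceID, 10, 13, @On;
EXEC sp_trace_setevent @TraceID, 10, 16, @On;
EXEC sp_trace_setevent @TraceID, 10, 17, @On;
EXEC sp_trace_setevent @TraceID, 10, 18, @On;
EXEC sp_trace_setstatus @TraceID, 1;   -- start the trace
SELECT @TraceID AS TraceID;            -- keep this ID so you can stop/close the trace later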

These are all the primary things I could come up with for day 1.  There are more items, I'm sure, but these are key; once they are all in place, I should have a good base to work from as I continue moving forward.

If you have other suggestions, let me know!

T-SQL Tuesday: Why are DBA skills necessary?

I wanted to add my thoughts on this month’s T-SQL Tuesday.  Haven’t heard about T-SQL Tuesday yet? Here are the details on it.

What is T-SQL Tuesday?

If you haven’t seen it yet, T-SQL Tuesday is the brain-child of fellow-MVP Adam Machanic (blog|twitter). It happens once a month on the 2nd Tuesday (different this month) and is hosted by a different person in the SQL community each time. The idea is that it’s kind of a rallying point to get a bunch of people to blog about a particular topic on the same day, giving a wide and varied set of opinions and interpretations of that month’s topic. The host then posts a wrap-up commenting on all the contributions. I think it’s a great idea and I contribute whenever I have time.

The host for this month is Paul Randal (Blog | Twitter), and here is a link to the original invitation post for this T-SQL Tuesday.

In my opinion, DBAs are the bridge between Development and Operations.  DBAs possess a level of technical detail that allows them to speak in SQL but at the same time talk to the end user/PM/manager/customer.  A DBA who has these skills can be very effective in fixing problems when they arise.  Many people joke about what DBA stands for; typically, DBA = Default Blame Acceptor.  Whether you accept the blame or not, as a DBA it's a fact that you will typically have many fingers pointed at you when something goes wrong.  In the DBA world you are guilty until you prove otherwise.  Being able to handle the situations that arise, regardless of where the blame lands, is why you need DBA skills.

DBAs typically possess two very important skills that set them apart.  The first is being able to see the 30,000-foot view: like it or not, the DB is usually the central source for many applications and processes in a company.  The DBA has to understand this and know how lots of different pieces work together.  They never have the luxury of only writing their one small piece of code that fetches bacon all day long (an apGetBacon stored proc would be nice).  They need to understand the server, the network, and the code to know how it all works together.

DBAs are usually very patient as well; having the finger pointed at you constantly for performance problems or size issues (I'm speaking of hard drives, of course) makes you learn to be patient and work through the problems.  Most of the time it does appear to be a database issue, since the database is typically central to all the different applications in the company, so the DBA has to take the time to find the real problem and to defend the database.

It's a simple fact that the DB is one of the most critical components of many organizations.  This fact alone tells us that DBA skills are very necessary to help move the business into the future.  While this is a great question to ask, I hope that businesses start focusing on a different one: when do we need a DBA to handle one of our most precious commodities?

Reads and Writes per DB, Using DMVs

Years ago, when I was working with a SQL 2000 database, I had a need to see how many reads and writes were happening on each DB so I could properly partition a new SAN we had purchased. The SAN guys wanted reads/sec, writes/sec, and total IOs/sec. Perfmon couldn't give this per DB, so I had to use fn_virtualfilestats. I wrote a procedure that would tell me per DB what was going on and then store it down into a table for later comparison.

I've found a need for this again, since I'm running some load tests and want to know what my data files are doing. This is easier now thanks to sys.dm_io_virtual_file_stats. It's even better now that so many people in the community provide great content! Glenn Berry (Blog | Twitter) wrote an excellent "DMV a Day" blog post series, and in one of the posts he gives a great query on virtual file stats. It gives you an excellent general point-in-time look at your server, but it was not exactly what I needed. I wanted to know my reads and writes for a specific period, which works better for my load testing, where I need to know just what is going on during a specific window.
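
For reference, a simple point-in-time query along those lines (not Glenn's exact query, just the general shape of it) looks something like this:

-- Cumulative file stats since the last restart; everything is totals, not a window.
SELECT
    DB_NAME(fs.database_id) AS DatabaseName,
    mf.type_desc AS FileType,
    fs.num_of_reads,
    fs.num_of_writes,
    fs.num_of_bytes_read / 1048576 AS MBRead,
    fs.num_of_bytes_written / 1048576 AS MBWritten,
    fs.io_stall_read_ms,
    fs.io_stall_write_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS fs
JOIN sys.master_files AS mf
    ON mf.database_id = fs.database_id
   AND mf.file_id = fs.file_id
ORDER BY fs.num_of_bytes_read + fs.num_of_bytes_written DESC;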

You might want to use this during heavy business hours as well. The DMV keeps track since the last server restart, so if you look at just the DMV you're going to see everything that has occurred. Maintenance items (CHECKDB, index maintenance, backups) will all show up in there. One DB might be very large, and at night when its backup kicks off it might generate lots of reads, making its percentage much higher, even though no one really uses that DB during the day. The DMV alone might show you that the DB is very busy when in reality your other DBs are the busy ones during the day. So if I'm planning for a new SAN, or moving files around, I would run this query at specific times so I could compare data points and find out what my databases are actually doing during critical periods.

This script takes a baseline record from the DMV. It then waits the amount of time you specify, takes a comparison record, compares the two, and returns the percentages. Lots more could be done with the comparison, and I've left certain things in for later changes, like Size; right now I don't do anything with Size, but I plan to show growth between the data points. I chose not to create the script as a stored proc just so it's easy for you to put it where you want. It only deals with 2 data points to keep it simple for now; I have considered changing this in the future. The only setting you need to change is the @DelaySeconds parameter. Just set that to the number of seconds you want it to wait. 60 seconds is usually a good default to get a quick snapshot of the server. To discover what your databases are doing during busy times, my suggestion would be to run it for about 300 seconds (5 minutes) and then do that 2-3 times in an hour to see what the data looks like.

Would love to hear any comments on whether this works for you!  Thanks.

/*********************************************************************************************
File Stats Per DB
Date Created: 07-22-2010
Written by Pat Wright(@SqlAsylum)
Email: SqlAsylum@gmail.com
Blog: http://Www.Sqlasylum.com
This script is free to download for personal, educational, and internal corporate purposes,
provided that this main header is kept along with the script.  Sale of this script,
in whole or in part, is prohibited without the author's consent.
*********************************************************************************************/

CREATE TABLE #FileStatsPerDb
(
ID INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
[databaseName] [NVARCHAR](128) NULL,
[FileType] [NVARCHAR](60) NULL,
[physical_name] [NVARCHAR](260) NOT NULL,
[DriveLetter] VARCHAR(5) NULL,
[READS] [BIGINT] NOT NULL,
[BytesRead] [BIGINT] NOT NULL,
[Writes] [BIGINT] NOT NULL,
[BytesWritten] [BIGINT] NOT NULL,
[SIZE] [BIGINT] NOT NULL,
[InsertDate] [DATETIME] NOT NULL DEFAULT GETDATE()
)
ON [PRIMARY]

DECLARE @Counter TINYINT
DECLARE @DelaySeconds INT
DECLARE @TestTime DATETIME

--Set Parameters
/*
The counter is just to initialize the number to 1
The Delayseconds Tells SQL Server how long to wait before it runs the second data point.
How long you want this depends on what your needs are.  If I have a load test running for
5 minutes and I want to know what the read and Write percents were during those 5 minutes
I set it to 300. If I just want a quick look at the system I’ll usually set it to 60 seconds,
To give me a one minute view.  This depends on if it’s a busy time and what’s going on during that time.
*/
SET @Counter = 1
SET @DelaySeconds = 60
SET @TestTime = DATEADD(SS,@delayseconds,GETDATE())

WHILE @Counter <= 2
BEGIN
    INSERT INTO #FileStatsPerDb
        (DatabaseName, FileType, Physical_Name, DriveLetter, READS, BytesRead, Writes, BytesWritten, SIZE)
    SELECT
        DB_NAME(mf.database_id) AS DatabaseName
        ,mf.type_desc AS FileType
        ,mf.physical_name AS Physical_Name
        ,LEFT(mf.physical_name, 1) AS DriveLetter
        ,num_of_reads AS READS
        ,num_of_bytes_read AS BytesRead
        ,num_of_writes AS Writes
        ,num_of_bytes_written AS BytesWritten
        ,size_on_disk_bytes AS SIZE
    FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS fs
    JOIN sys.master_files AS mf ON mf.database_id = fs.database_id
        AND mf.file_id = fs.file_id

    IF @Counter = 1
    BEGIN
        WAITFOR TIME @TestTime
    END

    SET @Counter = @Counter + 1
END

;
WITH FileStatCTE (DatabaseName, FileType, DriveLetter, TotalReads, TotalWrites, TotalSize, TotalBytesRead, TotalBytesWritten)
AS
(
    SELECT bl.DatabaseName, bl.FileType, bl.DriveLetter,
        NULLIF(SUM(cp.READS - bl.READS), 0) AS TotalReads,
        NULLIF(SUM(cp.Writes - bl.Writes), 0) AS TotalWrites,
        NULLIF((SUM(cp.SIZE - bl.SIZE)) / 1024, 0) AS TotalSize,
        NULLIF((SUM(cp.BytesRead - bl.BytesRead)) / 1024, 0) AS TotalKiloBytesRead,
        NULLIF((SUM(cp.BytesWritten - bl.BytesWritten)) / 1024, 0) AS TotalKiloBytesWritten
    FROM
        ( SELECT InsertDate, DatabaseName, FileType, DriveLetter, READS, BytesRead, Writes, BytesWritten, SIZE
          FROM #FileStatsPerDb
          WHERE InsertDate IN (SELECT MIN(InsertDate) FROM #FileStatsPerDb) ) AS bl -- Baseline
    JOIN
        ( SELECT InsertDate, DatabaseName, FileType, DriveLetter, READS, BytesRead, Writes, BytesWritten, SIZE
          FROM #FileStatsPerDb
          WHERE InsertDate IN (SELECT MAX(InsertDate) FROM #FileStatsPerDb) ) AS cp -- Comparison
        ON bl.DatabaseName = cp.DatabaseName
        AND bl.FileType = cp.FileType
        AND bl.DriveLetter = cp.DriveLetter
    GROUP BY bl.DatabaseName, bl.FileType, bl.DriveLetter
)

/*
Return the Read and write percent for Each DB and file.  Order by ReadPercent
*/
SELECT databasename,filetype,driveletter,
100. * TotalReads / SUM(TotalReads) OVER() AS ReadPercent,
100. * TotalWrites / SUM(TotalWrites) OVER() AS WritePercent
FROM FileStatCTE
ORDER BY ReadPercent DESC,WritePercent DESC

In my haste I forgot to add a sample of what you would get from this.  The names of the DBs have been changed.

[Sample output: read and write percentages per database]

Question Everything.


The world is full of information, so much so that we may believe there's nothing left to write or do. As I try to get back to blogging more frequently, I find myself wondering what to write about. It's not that my day isn't full of plenty of things to do. The question I was asking myself is: why should I write these things when the world already has great writers and great information out there?

The answer of course is simple. Question Everything.

Ask any of the experts in a field of study and they will tell you one of the best ways to learn a new topic is to present on it. You force yourself to learn what you're presenting so that you are prepared for it. Now don't get me wrong: I use the scripts available out in the community all the time. I suggest them to clients and to fellow DBAs. I am not advocating re-inventing the wheel; they are no doubt an essential tool in helping to make our lives easier and better.

Just because you get the perfect script to meet your needs doesn't release you from responsibility for what it does. If you install a new script on your servers and it slows things down or has a negative side effect that the original designer didn't mention or you didn't realize, your boss is not going to blame the person or blog you got the script from. They will blame you for releasing it into the environment.

So what should you do? As the person using the script, you should read what the person who designed it wrote about it. Most of the time the designer will detail what the script does in blog posts or in some sort of documentation; if by chance it doesn't have that, then dissect the script and figure out what it does. I prefer to take apart a script myself, because it forces me to learn how to read other people's code and figure out what it is doing, which to me is an essential part of being a DBA. We rarely get things in the format we expect. Unfortunately, many people don't think like a DBA.

For the senior/master gurus out there: don't just hand your new guys scripts and say "run this and it will do what you need." Instead, tell them you have a script that does what they need, explain how you approached it, and perhaps the new guy can find a new, better way. This teaches them and guides them at the same time. A good example of this is Quest's new perfmon counter poster. It puts a lot of information at your fingertips, but don't just look at the benchmark numbers and assume that's what works for all situations; make sure to benchmark your own servers and find out what numbers are right for your situation. As I read through this new poster over the weekend, I realized there were some things I could change in what I was tracking to give me more information about my server.

If you're one of the ones out there writing the scripts, THANK YOU! I applaud anyone who has put something out there for others to use and has helped make our lives a little easier. Keep up the good work and keep doing what you're doing.