Monthly Archives: January 2010

What Three Events Got Me Here?

I was once again tagged by Tjay Belt (Blog|Twitter), this time to write about what got me to where I am. I was also tagged by Allen Kinsel (Blog|Twitter). Thank you both for thinking of little old me.

To know where we are, we must know where we have been.

I got a computer.

Yes, it was a TRS-80. I had it hooked up to a small (13") TV. I did not have the tape recorder that saved programs, so basically I just messed around in BASIC: wrote something, lost it, then wrote it again and again. I learned to loop and put funny words all over the screen in various colors. That really didn't drive my interest much farther. Shortly after this I got an NES, which was tons more fun, and I spent countless hours playing it. It spurred my interest in electronics and computers much more than the TRS-80 ever did.

My final computer as a kid was a Packard Bell 486 33MHz DX, and that computer was really what started me down the path. I remember coming home at midnight after a football game (I lived in the middle of nowhere and bus rides were really long) and going right into the box, even though my father said I was not allowed to touch it or set it up until I read the entire manual. Yeah, I still don't read the entire manual. I had the computer set up in 20 minutes and was starting to use it with DOS 6.0. Most of my beginning years were spent just using the computer to talk with people; I logged onto many BBSes as a way out of a tiny town in the middle of nowhere. I did start to program a little in Pascal, and I could use DOS very well, creating batch files and changing memory settings so I could try to play some of the games my friends had.

I went to College (not for long)

I knew I wanted to get into computers from my time with my Packard Bell, though I always envisioned myself as more of an electronics person than a programmer. I was also heavily into phones and phone systems (yes, I built a Red Box from a Hallmark card). So during college I had every intention of getting a CIS degree and getting into networking. I took some more classes about computers and programming, even trying my hand at Assembler (I don't want to do that again). I only went to college for a year: I decided to marry the most wonderful woman in the world, which meant needing a steady job instead of just college jobs.

I got a job

So I started into the workforce like many of us: in customer support. I remember they wanted experience in Windows 95, while in college I had done DOS and Novell, so I knew plenty of command-line interfaces but hardly any graphical ones like Windows. I was still able to get the job, and over the next year I learned much more about Windows. I also learned how important customer support is, and how hard it is; I have lots of respect for most customer support techs. During this position I kept trying to work with the operations team as much as possible, as I still had a drive to work with computers and networking. I thought for sure I would be a network admin, and that was my goal.

Then it happened. I was a team lead at the company when we were purchased by another large organization. It wasn't for the worse, though: it turns out the large organization didn't like how its own customer support was being run, but it did like how we ran ours. So it turned over all of its software, and the job of supporting all 30K of its customers, to us. We had to learn their software and processes, and this was no small undertaking. Along with this came a helpdesk application known at the time as HEAT (now from FrontRange Solutions). It was a front-end helpdesk application with a back end of Sybase SQL Anywhere 5.5, and thus my first introduction to SQL and databases was born. I spent about another two years working with that database until we migrated up to SQL Server 7.0. From there I've done many different things with SQL Server, but I still have a copy of some of my original SQL Anywhere scripts as a reminder, so I always know where I am.

Through these three events many great people helped me along the way, not only as a DBA but as a person. I give back as much as I can to the community and to other DBAs, because I would not be here if it were not for someone telling me that you can't write a trigger that updates EVERY row of a million-row table each time it fires.

Pat

Picture obtained from Wikimedia.org, used under Creative Commons Share-Alike. Link to picture

Organization and Productivity: My Goals for the Year

I was tagged by Tjay Belt (Blog|Twitter) at the beginning of the year, and due to some big projects and a computer crash I'm just now getting around to posting my goals for the year.

This year will be a big change for me. I see it as a year to get organized and be productive, both for my community and for my company. While my life is never simple, I hope to simplify as many things as I can this year, so my goals are simple and to the point.

Goals for Community

· I have no defined position in the PASS organization, and while I'm helping out on projects, I really don't know what the year will hold for me and PASS. I have every intention of helping out as much as possible, like I have done over the last 4 years. I really want to blend my photography passion with PASS and do even more at the Summit with it. I have some ideas, and we'll see how those come together through the year.

· I have organized and planned 6 code camps over the last few years. While I love doing it, I'm a DBA and a SQL guy, and I've never put on a SQL Saturday! This will be the year to put one on.

· I want to present more: at least 4-5 times this year, and somewhere besides my local group. I might invade some other SQL chapters or SQL Saturdays in my area.

Goals for Work

· Get organized. Keep things simple and get them done. Rarely are my tasks simple projects, but if I can break them down into smaller pieces I can manage them better.

· Reach my goal with a system I'm designing. I can't really specify this one, but you will see blog posts on the ideas I'm working on.

Theme Word = Organization

In keeping with my goal this year, I am keeping this blog post simple and to the point. Sure, I have lots of other goals, both personal and work/community related, but these are my crucial ones. If I can accomplish these, the other items will come as well.

Fusion IO SSD Drive First Tests


I had some more time to do testing with the Fusion IO card over the last week. Once again it's proven to be much more than just a standard SSD drive. It gains the benefits of SSD by having no moving parts and being entirely flash based. The big difference between this card and SSD drives is that it plugs directly into the motherboard using a PCI-E slot, which gives it a much faster transfer rate than a standard SSD drive gets through the SATA controller.

I used Brent O’s (Blog|Twitter) Article and script on SQL Server Pedia to get the IO numbers for the drive (Listed here). If you run the script from Sql Server Pedia then you can download my data that I’ve placed on Google docs.  Paste it into the raw data section and you can view it in the pivot table to directly compare to my results.  I have only included IO’s per/sec,  but all the counters are available in the excel file.   Here is the file on Google Docs.

[Chart: random and sequential IOs per second, Fusion IO vs. SAN]

The Fusion IO drive I tested is the 160GB model, which retails at $6,995.

The SAN storage I tested against was seven 7200 RPM SATA drives in a 5+2 RAID 6 array (7 drives total), connected via 4Gb Fibre Channel.

So where can you use this effectively in your environment? If you're a medium-sized company and you have some large DBs (200-300GB), placing the whole DB on this drive is not really a cost-effective option. But here are some options I've found success with.

1. Place your Tempdb on the Drive.

a. By placing tempdb on the drive I got a boost in basically all SQL Server activity, since many of the day-to-day operations of SQL Server find their way into tempdb. I'll have a blog post in the future with some more specific numbers around this. (There's a quick sketch of the move after this list.)

2. Place your Indexes on the Drive.

a. Given the extreme write and read performance of these cards, if you place indexes on the drive you can also see a boost in performance. This does require a pretty big change, since you have to drop and re-create your indexes in a new filegroup. (See the second sketch after this list.)

3. Use the card as a way to remove disk latency for testing

a. I have a project right now that needs very, very fast disk access. I need to simulate a system that can do thousands of inserts a second in SQL Server. So instead of trying to find a RAID 10 array on my SAN and getting it configured, I can use this card to do that sort of testing right on my own box. It's allowing me to find the fastest way to get data into my system without worrying about disk performance.
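For option 1, this is roughly what the tempdb move looks like; a minimal sketch, assuming the card shows up as an F: drive (the drive letter and file paths are placeholders, and tempdb is recreated at startup, so the change only takes effect after a restart of SQL Server):

-- Point tempdb's data and log files at the Fusion IO card (F: is a placeholder).
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'F:\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'F:\templog.ldf');
-- Restart SQL Server and tempdb is rebuilt in the new location.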
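For option 2, the drop-and-re-create work looks something like this; every name here (database, filegroup, file, table, index) is hypothetical, and DROP_EXISTING rebuilds the existing index onto the new filegroup without a separate DROP INDEX step:

-- Add a filegroup that lives on the card, then rebuild an index onto it.
ALTER DATABASE MyDb ADD FILEGROUP FusionIO;
ALTER DATABASE MyDb ADD FILE (NAME = FusionIOData, FILENAME = 'F:\MyDb_FusionIO.ndf') TO FILEGROUP FusionIO;
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId ON dbo.Orders (CustomerId)
WITH (DROP_EXISTING = ON) ON FusionIO;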

So here’s a list of some of the Pro’s and Con’s of the drives in general.

Pro’s

Blazing fast speed

Easy to install and set up

Con’s

Price

These cards will degrade over time. Here's an explanation as to why, offered by Fusion IO.

Doesn’t NAND flash have a write limit? How does that effect the lifetime of the ioDrive™?

NAND flash has a limit on the number of writes that can be done to an individual cell. The particular limit depends on the type of flash used. For Single Level Cell (SLC) NAND, the limit exceeds 1,000,000 writes to a cell, whereas for Multi Level Cell (MLC) NAND, it is on the order of 10,000 writes. Hence, in order to exceed the limit of a single 80G ioDrive™, you would have to write almost 80PB (Petabytes) of data. Streaming data at 800GB/s to the card, it would take you 3.4 years of writing data non-stop to exceed the SLC limit.
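To sanity-check that math: 80GB × 1,000,000 writes per cell is about 80PB of total writes, and 800MB/s works out to roughly 25PB per year (0.8GB/s × ~31.5 million seconds in a year), so 80PB ÷ 25PB/year ≈ 3.2 years of non-stop, full-speed writing, right in line with their figure. No real workload sustains anything close to that write rate, so in practice the wear limit is much further out.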

Hopefully you can use the information I provided in the spreadsheet to determine your own ROI and whether the card is right for you. Personally, I have some more tests to run, and I will continue to post information on the write-heavy project I'm working on, with the Fusion card as my testing platform.

Pat

Using DTEXEC and /Set to run SSIS packages

I have not had much of a need to run SSIS packages from the command line, as most of my packages run in SQL Agent just fine. But I've written an SSIS package that I have to call many times to generate some load, and the easiest way I could see to do that was to run it from the command line. It was a very quick package with some variables whose values I wanted to change quickly and easily on each run, so I could run the package with different values.

I looked online for the /Set command and found Jamie Thomson's (Blog|Twitter) blog post about the command, located here.

This helped out a lot, but I was still having issues getting the syntax to come out correctly. The very last comment had a great suggestion: simply create a config file, then read it to get the full path. That worked so well I figured I would document it for next time.

How to get the path to your variable for the /Set command

Open your package and go into Package Configurations. You can do this by right-clicking on open space on your Control Flow; you should see this menu.

[Screenshot: Control Flow right-click menu]

Choose Package Configurations. You're presented with this window.

[Screenshot: Package Configurations window]

Check Enable Package Configurations and add a new configuration.

Choose XML configuration file on the next window, and place the file in a simple path that's easy to remember.

[Screenshot: XML configuration file selection]

On the next screen, choose the variable you want and check its Value box.

[Screenshot: selecting the variable's Value property]

Go ahead and click Next, then Finish.

Now you can open this file and see the full path and exactly what the DTEXEC command needs. 

[Screenshot: the generated config file showing the full path]

In my case it reads "\Package.Variables[User::iCounterEnd].Properties[Value]".
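For reference, that string comes from the Configuration element inside the generated file. Trimmed down, mine looks roughly like this (assuming iCounterEnd is an Int32; the ConfiguredValue is just whatever the variable held at design time):

<Configuration ConfiguredType="Property" Path="\Package.Variables[User::iCounterEnd].Properties[Value]" ValueType="Int32">
  <ConfiguredValue>10</ConfiguredValue>
</Configuration>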

Now my DTEXEC command looks like this:

DTExec.exe /Set \Package.Variables[User::iCounterEnd].Properties[Value];1 /File "c:\package.dtsx"

This allows me to set the package to run through only one loop instead of whatever I had set in the package. Make sure you go back to your package and disable/remove the package configuration you just created, or it will keep using the XML file.
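Since my whole reason for doing this was generating load, it's also easy to wrap that command in a loop; a quick sketch from a cmd prompt (use %%i instead of %i inside a .bat file, and the count of 100 runs is arbitrary):

for /L %i in (1,1,100) do DTExec.exe /Set \Package.Variables[User::iCounterEnd].Properties[Value];1 /File "c:\package.dtsx"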

This has made my testing really easy, so hopefully you can use it as well in the future, and it will be a little less confusing for you than it was for me.