Tuesday, June 12, 2012

Hyper-V standalone gets Release Candidate at TechEd 2012

Here at TechEd 2012 Microsoft has released the Release Candidate for the standalone version of Hyper-V v3. This is considered the release that will be on par with, or surpass, vSphere. You can download the RC version from Microsoft's website at http://www.microsoft.com/en-us/server-cloud/hyper-v-server/default.aspx.

Hyper-V v3 has doubled the capacity of v2: logical processors supported per host go from 160 to 320, physical memory from 2 TB to 4 TB, and virtual CPUs per host from 1,024 to 2,048. Virtual CPUs supported per VM have also doubled, from 32 to 64.

Now I don't know anybody who would even think about running a VM that large on Hyper-V. I have some playing around to do, but I don't think Hyper-V is ready for the primetime workloads that vSphere handles. vSphere has the history, experience and respect to support critical workloads. Not to mention you'll have to patch your Hyper-V server every Black Tuesday (or shortly thereafter).

I'll definitely load up Hyper-V v3 in my lab and take it for a spin. I need to get my VMM server up and running, too, to take full advantage. Does anybody know if VMM 2012 supports Hyper-V v3? Now for some more homework.

Monday, June 11, 2012

Day 1 at TechEd

Day 1 at TechEd included meeting some great people and talking to some smart people. That is not to imply that I don't normally talk to smart people; I talk to myself all the time.

I had a little snafu at registration, but they were very helpful in getting everything straightened out. I worked the Veeam booth today and was able to show the great functionality behind Veeam's Backup and Replication along with Veeam ONE. Surprisingly for a Microsoft event, I didn't get many people asking about our Management Pack for VMware that works with SCOM (System Center Operations Manager). If you are interested, check out www.veeam.com/scs2012 for a free 10-pack of licenses for the nworks Management Pack.

It was great to talk with people about Hyper-V. I grew up learning VMware, and I am a VMware nerd. This Hyper-V thing is new to me and I am learning it as fast as I can, but in my normal day-to-day I don't talk with a lot of people about Hyper-V, so it is nice to finally see people who use Hyper-V. I am waiting for the Hyper-V version of the VMUG, maybe the HVUG. Speaking of user groups, did any of you know that Veeam now has User Group meetings all over the country? Check out http://go.veeam.com/user-groups.html to see if there are any Veeam User Group meetings in your area. They don't have them on the website yet, but they are planning on having two user group meetings in Chicago, IL and one in Milwaukee, WI sometime in July.

Sorry, got a little sidetracked there. Back to TechEd: I didn't make it to any sessions today, but I do hope to make a couple. I also plan on attending the Thursday night session (the TechEd party). I will say there are a ton of people here; it's an excellent place to talk tech, learn some new things and have fun. I apologize for not giving you much of an update on what happened here at TechEd, as I was at the booth most of the time; I will hopefully have something for you tomorrow. I will leave you with some links I want you to check out:


Vote for Veeam ONE for Best of TechEd 2012 at http://northamerica.msteched.com/bote (under the Virtualization category, Veeam ONE).

Check out Veeam's own Alec King presenting "Massively Multi-Instance: Building and Deploying Microsoft System Center Operations Manager Management Packs in the Enterprise" on Tuesday, June 12 from 10:15 AM to 11:30 AM in N320E.

Veeam announced an add-on to extend the built-in MS SCOM Generic Report Library. This is an add-on that does not require Veeam; they are giving it away to the community to enhance reporting within SCOM, and you can use it on any data within SCOM. It is not for the casual SCOM user, though; it is meant for the SCOM administrator who needs to build highly detailed reports.

Stop by the Veeam booth for your "Veeam is for Lovers" t-shirt.


Sunday, June 10, 2012

Blogging from TechEd

I am on the plane on my way to TechEd. I will blog from the show floor and from the different sessions that I attend. Stay tuned.

Sunday, May 6, 2012

Not everything in life is technology

Just got back from a great staycation with the family for some overnight fun at Key Lime Cove. Now I just got done making some French Vanilla ice cream. I knew there were eggs in French Vanilla, but I didn't realize how many. With 4 cups of cream there are 7 or 8 egg yolks in there. Very simple recipe too, and I also realized that French Vanilla ice cream is pretty much egg nog without the nutmeg. If you're interested in the recipe, here it is. I never said this was just a technical blog, just a "techie" blog, and this techie just made some ice cream.

7 egg yolks
4 cups of whipping cream
1 cup of sugar
2 teaspoons of vanilla extract

Combine the yolks and half of the sugar in a bowl and mix well.
Take 1 cup of the cream and the other half of the sugar and slowly bring to a boil in a medium/large saucepan. When it comes to a boil, slowly mix it into the yolk mixture. Then transfer everything back to the pan and cook slowly for about 10 minutes; it should be thick but not chunky. Turn off the heat and set the pan in ice water to cool. After it is cool, pour it into your ice cream maker and finish accordingly. Enjoy; it is very rich. You can substitute up to 1 1/2 cups of the cream with whole milk to lighten it up some.

Next week I will post an update on my C# progress. Stay tuned.

Saturday, May 5, 2012

Starting on a new frontier

Have you ever been sitting there thinking, "if only there was a program that did that"? Well, I ran into that the other day and decided it was time to actually sit down and learn some C#. I've dabbled very little in the past but never really had an end goal for what I wanted a program to do, so I always got bored with it. Plus, since I don't do it often, I kept forgetting everything that I learned. This time I have a clear goal in mind, and it's pretty simple actually.
The program will provide some text boxes for user input, read an XML file, display data that the user can adjust options on using that input, and produce output that changes data in a database. This need arose from having to change the rollup times for metrics in a monitoring program.
So far I have created the main interface and a calculator that converts time measurements into other measurements, such as X minutes into hours, seconds, milliseconds, etc. I have also created a DataGrid that pulls in the data from the XML file. I still have to learn LINQ to narrow down the data in the DataGrid.
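
To give you an idea of where I'm headed, here is a rough sketch of the LINQ-to-XML piece I'm aiming for. Fair warning: the element and attribute names ("Rollup", "Name", "IntervalMinutes") and the file path are made up for illustration, since the real XML from the monitoring program looks different, so treat this as a learning note rather than finished code.

using System;
using System.Linq;
using System.Xml.Linq;

class RollupSketch
{
    static void Main()
    {
        // The real app takes this from a text box; hard-coded here for the sketch.
        int minutes = 15;
        long ms = (long)TimeSpan.FromMinutes(minutes).TotalMilliseconds;
        Console.WriteLine("{0} minutes = {1} ms", minutes, ms);

        // Load the XML and narrow it down with LINQ instead of showing the whole file.
        // "Rollup", "Name" and "IntervalMinutes" are made-up names for this example.
        XDocument doc = XDocument.Load(@"C:\temp\rollups.xml");
        var rollups = from r in doc.Descendants("Rollup")
                      where (int)r.Attribute("IntervalMinutes") >= minutes
                      orderby (int)r.Attribute("IntervalMinutes")
                      select new
                      {
                          Name = (string)r.Attribute("Name"),
                          IntervalMinutes = (int)r.Attribute("IntervalMinutes")
                      };

        // In the real program this result would get bound to the DataGrid instead.
        foreach (var r in rollups)
            Console.WriteLine("{0}: rolls up every {1} minutes", r.Name, r.IntervalMinutes);
    }
}

If nothing else, writing it out this way is helping the LINQ query syntax stick in my head.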
I will update this post with the different pieces that I will need to learn and how I went about obtaining my results.
Stay Tuned.

UPDATE: It seems like when you really want to start getting into something, everything else picks up. I haven't been able to dig into the programming as much as I wanted. I've done some interface work in Visual Studio but haven't gotten down into the nitty-gritty. I hope to in the next week or so; please stay tuned a little longer.

Thursday, May 3, 2012

What do you want to hear about?

Please post comments below to throw some ideas at me about what you want to hear about. I've got my own ideas, but it's not always about what I want. Wait a minute, this is my blog, so it is all about what I want. Even though it is all about what I want, I want to know what you want. Post your comments below and shoot out some ideas.

How many backup servers do I need?

This has got to be the worst question that I can get asked, because I don't have a direct answer. I always tell people "it depends," which sounds like a cop-out but it is true. It depends on a lot of factors, such as:

  • How much data are you backing up?
  • How fast is your production storage at sending the data to the backup server?
  • How fast is your media server/proxy server at processing the data it is getting?
  • How fast is your network at handling the flow of data (unless you are doing LAN-free backups)?
  • How fast is your target storage for the backup files (if doing disk-to-disk backups)?
  • How fast is your network from the media server/proxy server to your target storage?
  • How much data are you backing up each night?
  • How often are you doing full backups?
  • Are you doing active fulls or synthetic fulls?
  • How big is your backup window?
  • Is it OK to have your backups overflow into production hours?
  • Do you need to back up using application agents?
  • Are you backing up physical or virtual?
  • If virtual, are you using image-based backup solutions to get the VM?
  • If virtual, are you using VMware technologies like Changed Block Tracking?
  • It all comes down to "how fast can your infrastructure go?"
You can relate this question to one in your personal life, like when someone asks, "How long is it going to take you to get there?" You can give a rough estimate, but there are many things to take into consideration: what is your rate of travel, what is the traffic like, which streets are you going to take, and are you going to take a bike, your car, a bus, or walk? All of these are similar to your backups. It depends on the mode by which you're going to get your data from point A to point B and how congested that route is.
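
To put some very rough numbers behind "it depends," here is a quick back-of-the-envelope sketch in C# (sticking with the language I'm learning). Every figure in it is made up, so plug in whatever your own environment actually measures; the only point it makes is that the backup window is gated by the slowest link in the chain above.

using System;

class BackupWindowEstimate
{
    static void Main()
    {
        // All hypothetical numbers -- substitute your own test results.
        double dataToMoveGB = 2000;      // nightly backup set

        // Throughput of each hop in the chain, in MB/s.
        double sourceStorageMBs = 400;   // how fast production storage can read
        double networkMBs = 110;         // roughly 1 GbE, unless you go LAN-free
        double proxyMBs = 300;           // media/proxy server processing rate
        double targetStorageMBs = 150;   // how fast the backup target can write

        // The window is gated by the slowest link in the chain.
        double bottleneckMBs = Math.Min(Math.Min(sourceStorageMBs, networkMBs),
                                        Math.Min(proxyMBs, targetStorageMBs));

        double hours = dataToMoveGB * 1024 / bottleneckMBs / 3600;
        Console.WriteLine("Slowest link: {0} MB/s, so roughly {1:F1} hours", bottleneckMBs, hours);
    }
}

With those made-up numbers the 1 GbE network is the bottleneck and a 2 TB nightly run takes a little over five hours; change any one number and the answer changes, which is exactly why "it depends."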

Now it gets much easier when these questions are answered, and the more information you give us, the more detailed a response you'll get. But nothing beats good ol' testing. Test your environment. See how long it takes to back up one server; OK, now see how long it takes to do the incremental; OK, now see how long it takes to back up two servers at the same time. When running things in parallel in the IT world, it is not a linear improvement. Running one backup might take 30 minutes, running two backups might take 35 minutes and running three backups might take 45 minutes. There will definitely be a point of diminishing returns, and that is where testing will prove to have paid off. You will get to a point where it takes you longer to back up than if you had backed off one or two servers and waited for one of them to finish running. Where that point is is entirely up to your hardware; most likely it is not the software. Most backup software these days is built for multi-threading and to utilize resources as efficiently as possible, but you can't magically squeeze more out of your hardware than what its capacity is. Now, some products offer compression and deduplication that will technically allow you to send more data than others.
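
And if you want a feel for why parallelism isn't linear, here is one more toy model along the same lines (again, all hypothetical numbers): every concurrent job has to squeeze through the same shared bottleneck, so adding jobs mostly just slices that bottleneck thinner.

using System;

class ParallelBackupToyModel
{
    static void Main()
    {
        // Hypothetical numbers: each job could move 100 MB/s on its own,
        // but the shared target storage tops out at 150 MB/s total.
        double perJobLimitMBs = 100;
        double sharedBottleneckMBs = 150;
        double jobSizeGB = 200;          // data each server has to move

        for (int jobs = 1; jobs <= 5; jobs++)
        {
            // Each job gets an equal slice of the shared link, capped by its own limit.
            double perJobRateMBs = Math.Min(perJobLimitMBs, sharedBottleneckMBs / jobs);
            double minutes = jobSizeGB * 1024 / perJobRateMBs / 60;
            Console.WriteLine("{0} concurrent job(s): {1:F0} MB/s each, batch done in ~{2:F0} minutes",
                              jobs, perJobRateMBs, minutes);
        }
    }
}

The exact curve is entirely up to your hardware, but the shape is the same: somewhere past the second or third concurrent job the wall-clock time starts climbing fast, and that is the point of diminishing returns you only find by testing.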

I hope you enjoyed my little lesson here. It is something I learned when I came to the dark side (more on this later).