Author Archives: HoundTech

Whose Job Is It Anyway?

Disclaimer: This is a generalized rewrite of an article I wrote for a company newsletter. I’m adding it here because I think we ALL need a little reminder.


Actually, the better way to phrase that would be “Whose career is it anyway?” Right? I mean, most of us prefer to think of how we spend our days as a career and not just a job. We’ll come back to that later. For now, let’s just think at the ‘job level’: what you do today, tomorrow, and maybe even next week.

I’m aiming most of this at those of us who are employed by someone else. Self-employed people and business owners are usually well aware of who’s responsible for their skills.

Who is responsible for making sure we keep up with the changes in the business world, or at least our little corner of it? Of course your supervisor is supposed to make sure you meet the minimums required to do your current job. That works if all you want to do is be a minimum employee and punch the clock every day. Maybe you’ll be able to do that until you retire, but more likely your job function will change and you’ll find you don’t have the basic skills required to meet even that minimum. That, my friends, is all on you.

The company is responsible for making sure you meet at least the minimums, sure. Some of us are fortunate enough to work for a company that offers educational programs of several different types: reimbursement programs for job-related education, company-sponsored training sessions. None of that has any effect, though, if you don’t invest at least your time, and maybe even some of your money, to improve your skillset. I spend my own money (in addition to company education) to pay for continuing education in my chosen career. I know, some of you are saying, “But I’m not an IT person, I’m just a ____.” I could go on and on about why that shouldn’t matter, or I can simply quote the man who said it best.

“…Even if it falls your lot to be a street sweeper, go on out and sweep streets like Michelangelo painted pictures; sweep streets like Handel and Beethoven composed music; sweep streets like Shakespeare wrote poetry; sweep streets so well that all the host of heaven and earth will have to pause and say, ‘Here lived a great street sweeper who swept his job well.’” – Dr. Martin Luther King, Jr.

There are many morals in that quote, but the one I want to draw your attention to today is this: take pride in what you do and be the best possible at it. Your manager is not responsible for your mortgage or rent; you are. Your supervisor isn’t responsible for feeding your family; you are. The ‘Company’ isn’t responsible for your career. YOU ARE.

Spend a little of your time improving your skillset in whatever you do. If you want to improve and don’t have a clear direction, ask your manager, and if they can’t tell you, then ask their manager. Get online and use Google to search for “warehousing best practices” or “call center best practices”. Take an online class in Accounting Principles. Watch a YouTube video on something other than cute cat tricks. Read a book. LEARN something.

Even if what you learn doesn’t immediately apply, it will give you a depth of understanding of why to do your job in a certain way, or even inspire you to think of a better way to do it! That’s what increases your value, helps you move up in position and pay, and incidentally, makes it more likely you’ll stay employed.

At the end of the day, you can lose your job. However, if you’ve invested in your career and yourself, not only would it be easier to get a new job, but you’ll be better at your current job. So don’t just be “minimum”, be exceptional!

TestLab v2 – The aborted build

If you missed the first two parts of this, start here and continue here…

<SNIP!>

So the reason for the long delay in finishing this is some hardware problems with my test server. What was going to work fine for Server 2012 doesn’t work for crap with Server 2016.

The problem is the CPU. The old server I had planned to use as a lab does NOT have a SLAT-capable (Second Level Address Translation) chip. Since that’s a requirement for Hyper-V on Server 2016, it’s kind of a show stopper.
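
If you want to check a box for SLAT support before you haul it home, a couple of quick commands can tell you. This is a sketch; the Coreinfo executable name depends on which Sysinternals download you grab, and it has to be on your path:

```powershell
# Coreinfo (Sysinternals) with -v dumps virtualization-related features.
# On Intel chips look for EPT, on AMD look for NPT - an asterisk means supported.
.\Coreinfo.exe -v

# On Windows 8/Server 2012 and later, systeminfo also summarizes the
# Hyper-V requirements, including SLAT, near the bottom of its output.
systeminfo | Select-String 'Second Level Address Translation'
```

Either one would have saved me the surprise.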

However – all is not lost! Jason Helmick and Melissa Januszko cooked up a PowerShell automated lab environment that uses Virtual Engine’s Lability module to easily stand up a lab on any Windows 10 machine. You don’t even have to manually download the ISO files for the OS installs. Now I can very easily stand up and tear down a lab with little fuss.
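
The basic Lability workflow, as I understand it from the docs, looks something like this (the `TestLab.psd1` file name is just a placeholder for your own DSC configuration-data file):

```powershell
# Grab Lability from the PowerShell Gallery
Install-Module -Name Lability -Scope AllUsers

# One-time host prep: creates the folder structure, checks Hyper-V, etc.
Start-LabHostConfiguration

# Stand up a lab from a DSC configuration-data (.psd1) file -
# Lability downloads the ISOs it needs based on the media IDs inside.
Start-LabConfiguration -ConfigurationData .\TestLab.psd1
Start-Lab -ConfigurationData .\TestLab.psd1

# And tear it all back down when you're done
Stop-Lab -ConfigurationData .\TestLab.psd1
Remove-LabConfiguration -ConfigurationData .\TestLab.psd1
```

All of the interesting detail lives in that .psd1 file, which describes the VMs, networks, and media for the lab.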

So with the lab situation handled, I’m moving on!

My goals this year are to get better with DSC and Pester testing, and to complete a build pipeline for work. Let’s see how it goes…

Test Lab v2 – the Planning Stage

(For the introduction post: Rebuilding the Test Lab v2)

Before getting into the actual planning let me describe the environment and restrictions.

  • Hardware
    • Dell 2950 w/ 2 Xeon 3GHz CPUs, 32 GB RAM, 12 TB JBOD Direct attached
  • Software
    • Windows 2012 R2 (full install), Hyper-V role, with Windows Deduplication running on the JBOD to keep file sizes on VHDs down. The license is through our company MSDN account, permissible as this is for testing and development.
    • Powershell v5 is installed
  • Network
    • Dual gigabit ethernet adapters connected to the company LAN.
  • Restrictions
    • As a “guest” on the company network, I have to be very careful to isolate traffic in and out of my test environment. I’ll use a Vyos VM Router to do this.
    • I have no System Center VMM, no VMware, just plain vanilla out of the box Windows.

 

Alright, so with our tools laid out, let’s talk about goals. What do I want to be able to develop and test on this box? What’s that going to take? I’ve got to keep this simple or I’ll go down a rabbit hole of automating the setup of the environment and get bogged down in minutiae. That may come later, but for now, simple wins over cool new stuff.

Goal 1: Learn to work in more of a DevOps kind of environment, with source control and a development pipeline for my PowerShell-based tools. For this we’ll need TWO virtual subnets – one for Dev and one for Test. Since there will only be one or two people on this at a time, I can build it all on the same box for now. Later, when this process becomes more mainstream, it won’t be difficult to rebuild the infrastructure on a production box.

Goal 2: Build as much as possible with DSC – within reason. This is that rabbit hole I mentioned above. True, you can build out a full network from scratch with DSC and a server WIM, but I’ve never done that, and in the interest of getting things running right now, I’m going a more old-school route: build a “Base” server in each subnet that is a multifunction server. It’ll be a domain controller with DHCP, DNS, Windows Deployment Services and a DSC pull server. From THERE I can work on the things I’m either rusty or inexperienced on. Walk before run before fly and all that good jazz.
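
As a rough sketch of what that “Base” server looks like in DSC terms, the role installs alone are just a handful of WindowsFeature resources. (The pull-server piece needs the xPSDesiredStateConfiguration module’s xDscWebService resource; this sketch sticks to what’s in the box, and the configuration name is my own.)

```powershell
# Minimal DSC sketch of the multifunction "Base" server roles
Configuration BaseServer {
    Import-DscResource -ModuleName PSDesiredStateConfiguration

    Node 'localhost' {
        WindowsFeature ADDS { Name = 'AD-Domain-Services'; Ensure = 'Present' }
        WindowsFeature DNS  { Name = 'DNS';  Ensure = 'Present' }
        WindowsFeature DHCP { Name = 'DHCP'; Ensure = 'Present' }
        WindowsFeature WDS  { Name = 'WDS';  Ensure = 'Present' }
    }
}

# Compile to a .mof, then push it at the new VM
BaseServer -OutputPath .\BaseServer
Start-DscConfiguration -Path .\BaseServer -Wait -Verbose
```

Promoting the domain controller and configuring DHCP scopes takes more than a feature install, of course, but this gets the bones in place.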

I might add a goal 3 later, but for now this is good. Let’s diagram this out so we can get the big-picture overview.


Right. Next step: we build three VM switches, a Vyos router, and two “Base” servers.
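
In Hyper-V terms, that build step might look something like this. The switch and VM names are just my placeholders, and ‘Ethernet’ stands in for whichever physical adapter gets the uplink:

```powershell
# One external switch for the Vyos router's uplink to the company LAN,
# plus a private switch per virtual subnet to keep lab traffic isolated.
New-VMSwitch -Name 'Uplink'  -NetAdapterName 'Ethernet' -AllowManagementOS $true
New-VMSwitch -Name 'DevNet'  -SwitchType Private
New-VMSwitch -Name 'TestNet' -SwitchType Private

# The router VM gets a NIC on all three switches
New-VM -Name 'VyosRouter' -MemoryStartupBytes 512MB -SwitchName 'Uplink' -Generation 1
Add-VMNetworkAdapter -VMName 'VyosRouter' -SwitchName 'DevNet'
Add-VMNetworkAdapter -VMName 'VyosRouter' -SwitchName 'TestNet'

# Each "Base" server lives on its own subnet
New-VM -Name 'DevBase'  -MemoryStartupBytes 4GB -SwitchName 'DevNet'  -Generation 2
New-VM -Name 'TestBase' -MemoryStartupBytes 4GB -SwitchName 'TestNet' -Generation 2
```

The private switches are the isolation trick: nothing on them can reach the company LAN except through the Vyos router.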

See ya then!

 

Rebuilding the Test Lab v2

Last year I wrote an article for PowerShell.org extolling the benefits of a home lab and how it didn’t cost much to build a basic one. You can read it here.

That lab has served me well, but things change, needs increase, and opportunities arise. The changing need, obviously, is “I want to be able to run more VMs without sitting and listening to my disk thrash for minutes to start one.” The answer to that need is “buy more RAM or an SSD,” both of which have that nasty side effect of costing money. So I gritted my teeth and waited…

Fast forward, and now my work is decommissioning physical servers because they’re no longer covered under a 4-hour service agreement – and also due to a ton of virtualization by yours truly. So there are a few functioning servers with older CPUs and smaller disks sitting idle… yeah, right. Time for TestLab v2!

This time I’m doing things a little differently. First of all, obviously, I’m not building on a Windows 8/10 machine. Secondly, this box, while small by server standards, is a big improvement over my home PC. Also, I’m building this as a test lab for our team, so it’s a little more “official.” I am using their hardware and network, after all; I should share *grin*!

Now, I’ve recognized a flaw in my process of “build, test, document.” Really it’s a side effect of my mild ADD and the hectic work pace I keep. Once I’ve built it and solved those problems, and tested it and solved those problems, I kind of lose interest. There’s no more puzzle to solve, just homework to do. Bah.

So we’re going to try a NEW (to me) technique. I’m going to write AS I build it, typing during those install waits and reboots. I’m going to break this into a few parts: first this introduction, followed by a section on pre-build planning, one on the actual build, then a wrap-up “post-mortem” post.

Let’s see how this goes!

 

Update Status

Once again life has bitten me in my best intentions.

I promised myself and both my readers that this blog was going to be a regular thing, and in the beginning I did OK. Then life happened, as usual. Between work projects and an increasing feeling of pressure to finish my MS certifications, blogging fell by the wayside. This has been somewhat exacerbated by my agonizing over every word that goes on the screen.

“What if I put out something wrong?” “What if I look dumb?”

I had an interesting conversation with a gentleman I admire and respect, who told me to stop agonizing and DO IT. If there is a mistake, own it, fix it, and move on. Spell check, grammar check, POST, he told me.

Later on at TechMentor Don Jones and Jason Helmick talked about how people are slowed by a fear of failure. The “what if..” consumes them into paralysis. To grow, to move forward, we have to get over that. Conquer it.

That’s what blogging was like for me. I was foolishly trying to make every entry a Pulitzer Prize-type thing. Yeah, ridiculous, I know. Especially when I have more fingers than readers.

So this is me “getting over it.” I’ll post much more often. Things may not be perfect, but they’ll be honest and as accurate as I can make them.

Here’s to the ride.

And now for something completely different………..

So I had the cool project going for the email account scripting and all of that mentioned in the previous post.

But then the inevitable happened: pulled from that project to work on this more important project, pulled from that second project to work on putting out this fire. Blah blah blah, the story of an IT pro in a smaller company. It’s both my greatest joy and largest pain, so rather than a long pause between posts, I thought I’d dash out a little blog-blurb about what’s going on.

The major project that pulled my attention away from the Exchange reporting project is actually related. While I was working on that and sharing some preliminary results with some senior management folks, they said, “Bah, those people don’t need email – just delete them.”

Of course, that throws a monkey wrench into the employee onboarding script I had written some time ago. See, at that point the policy was ‘everyone gets a mailbox,’ and I was a bit of a larval scripter, certainly not quite a toolmaker. The result was a one-size-fits-all script with a lot of assumptions and not much reusability. So with the change in policy, I took one look at the old script and said, OH HELL NO! Time to refactor.

So what was a 200-line script that did several intertwined things is in the process of becoming a full-fledged module with six advanced functions, plus a ‘controller’ script to tie them all together.

The bad news is that put my Exchange/Archive joining project on hold.

The good news is that this refactored User Profile Tools module will save us a lot of time, confusion, and headache. Bonus round: I’ll have another subject for this blog. :)

Until Next Time…………………

Exchange and Archive PST Blues

This may draw the ire of some big-environment, siloed Exchange admins, but our environment gets very little “administration.” Oh, patches are done regularly, users are added and removed, but that’s about it. I keep an eye on the event logs for errors, watch disk space to make sure we’re not running out, and make sure the DAGs are replicating. Done – end of list. There are simply too many other irons in the fire to deal with small “wouldn’t it be nice” things.

One of those things on the One Day List is the .PST files resulting from Outlook AutoArchive. Back in the old days, we had to use archiving to keep file sizes down for performance reasons. These days, though, with newer versions of Outlook, setting the sync window on your cache keeps the amount of data loaded into Outlook at a manageable level. That, plus the relatively low cost of storage, has made the archive unnecessary – in our environment, at least.

Just because the archive process isn’t needed doesn’t mean we can just seek out and destroy all of the .PST files scattered all over the place. No, we need to find them and import them back into the users’ mailboxes on Exchange. This is a good news/bad news sort of situation. The good news: we back up all of our PCs centrally using a file-copy solution, so I have copies of all the PST files on a backup drive – one nice, easy place to find them. The bad news? We’re working toward a migration to Office 365, so size does matter. How many of these mailboxes are going to be over 50 GB? Any? Some? All I could do was guess, and I hate to guess. I’m paid to know (or at least find out)!

This looks like a job for POWERSHELL!

Allow me to finish setting the technical scene with these little tidbits. I just started running the Windows 10 Insider Preview on my production workstation, and while I’ve messed about with PowerShell 5 in a lab or at home, a domain-joined production PC feels different. Secondly, we’re still on Exchange 2010, with all that entails. I long ago added implicit remoting sessions to a DC and one of my Exchange servers to my PowerShell profile, so I can Get-Mailbox and Set-ADUser without having to use Invoke-Command or remember to import anything. All set. Good, right? Not so fast.

I figure I’m going to break this project into 3 pieces.

  1. I’ll write a script to pull the sizes of all the company mailboxes. Since I have users that seem to forget how to empty the Deleted Items folder, I want to break out the result into Inbox, Sent Items and Deleted Items as well as the total size of each mailbox. I’ll output this to a CSV file called MailboxSizes.csv
  2. I’ll write another script to scan our backup files for .PST files. I’ll make note of the path and the size and output this to a file called Archives.csv
  3. Then I’ll write a third script to import these PST files into their owners’ mailboxes.
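
Step 1 might be sketched roughly like this (hedged: property and scope names per the Exchange cmdlet docs, and as the rest of this post explains, the sizes come back as strings over remoting):

```powershell
# Rough sketch of the mailbox-size report: total size plus the three
# folders of interest, exported to MailboxSizes.csv
$report = foreach ($mbx in Get-Mailbox -ResultSize Unlimited) {
    $stats = Get-MailboxStatistics -Identity $mbx.Identity
    [pscustomobject]@{
        User         = $mbx.DisplayName
        TotalSize    = $stats.TotalItemSize
        Inbox        = (Get-MailboxFolderStatistics $mbx.Identity -FolderScope Inbox).FolderSize
        SentItems    = (Get-MailboxFolderStatistics $mbx.Identity -FolderScope SentItems).FolderSize
        DeletedItems = (Get-MailboxFolderStatistics $mbx.Identity -FolderScope DeletedItems).FolderSize
    }
}
$report | Export-Csv -Path .\MailboxSizes.csv -NoTypeInformation
```

Simple enough on paper, anyway.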

Sounds simple, right? Well, in concept it was simple. Execution, however, was a wee bit trickier. See, I forgot one of the gotchas of PowerShell remoting: the objects returned from Exchange in a remote session are not the same objects you get running locally. For example:

PS C:\> (Get-mailboxstatistics -Identity Greg).TotalItemSize.Value.ToMB()

This should return the total size of my mailbox in megabytes. Locally, it does. Remotely, your return looks like this:

PS C:\> (Get-mailboxstatistics -Identity Greg).TotalItemSize.Value.ToMB()

You cannot call a method on a null-valued expression.

……What the heck? So, putting on my sleuthing hat, I worked backwards. I removed the “.ToMB()” to see if Value had anything in it that perhaps didn’t like the method. Interestingly, I got nothing. There’s your null-valued expression right there – but why is it null? Let’s back off one more step to see where the problem is hiding.

PS C:\> (Get-mailboxstatistics -Identity Greg).TotalItemSize

642.5 MB (673,721,009 bytes)

….. OK, some data. But why is the Value property empty?

Hmm. With a head scratch, I pipe it to Get-Member and light dawns: lots of methods, but only one property, and it’s Length. The deserialized TotalItemSize doesn’t even have a Value property – it’s just a string. And it’s hard to do math with a string value.

That’s when I remembered: Exchange 2010 was an early adopter of PowerShell, but it was PowerShell v2, and remoting wasn’t as mature. So I tried a few other things, like Invoke-Command, thinking that since the command would run on the Exchange server it might work. Alas, it wasn’t to be. I wasn’t going to RDP onto the server, write and run a v2 script there, and copy the file back to my desk – that would defeat the ‘automation’ part of this exercise. We have people coming and going every week; this report was going to be run several times! So after noodling around for an hour or more trying to work around this deserialization problem, I decided to take the lemon and make lemonade.

Time to start working with the string I was given. I knew the best way to do this was with regex (regular expressions), and honestly, I had no idea how to start building the expression. Time to Bing-Google it. Bingo! An old Hey, Scripting Guy! blog post helped out. While the post was aimed at Exchange Online, the regex it demonstrated worked great for me, so I “re-purposed” that snip. I also added “learn regular expressions” to my <To Learn List>.

With a little polishing I now had a working basis for my scripting project.

PS C:\> [Math]::Round(((Get-MailboxStatistics -Identity greg).TotalItemSize -replace '(.*\()|,| [a-z]*\)', '')/1MB,2)

642.5
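
Since the regex part is pure string work, it can be pulled out into a small helper (the function name here is my own, not anything shipped with Exchange):

```powershell
# Hypothetical helper wrapping the regex trick, so the string-to-MB
# conversion lives in one reusable place.
function ConvertTo-MB {
    param([string]$SizeString)   # e.g. '642.5 MB (673,721,009 bytes)'
    # Strip everything up to the '(', the commas, and the trailing
    # ' bytes)', leaving only the raw byte count, then convert to MB.
    $bytes = $SizeString -replace '(.*\()|,| [a-z]*\)', ''
    [Math]::Round($bytes / 1MB, 2)
}

ConvertTo-MB '642.5 MB (673,721,009 bytes)'   # -> 642.51
```

Working from the raw byte count rather than the pre-rounded “642.5 MB” also buys a bit more precision.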

In my next post it’s time to put some toolmaking magic on this puppy!

Avoiding Post-Ignite Flameout

Or should it be “Post-Ignition Charring”?
We all know the feeling. You just spent a week or so at a major conference (we’re looking at you, Microsoft Ignite), getting your head crammed with new and exciting tech things. Now you’re back at work and you have to digest it all and decide what to implement, how to implement it, and what to investigate further. It can be overwhelming, so rather than letting this surge of creative ideas fade, here are some suggestions on how to ride the wave and not be drowned by it.

  1. Divide and conquer – Grab a sheet of paper or open your favorite note-taking software and make a list of all the interesting new things you heard about. It doesn’t need a great deal of detail at this point; we’re just capturing the first glance for now. Once you have a list, go down it and split it into two lists.
    • The first is “Learn for (insert employer)”. This is stuff that is now, or soon will be, part of your daily work – the stuff you have to stay on top of to keep those paychecks flowing. It’s also probably the reason they sent you to that big conference in the first place, so it’s smart to show them that return on investment.
    • The second list is “Learn for career”. Here’s where you put all the stuff that you know is going to be relevant for some time but your current employer doesn’t use or doesn’t use yet. For example, Desired State Configuration, Azure cloud services, etc. These are the things that you have to learn on your own to stay relevant in a rapidly changing tech world.
  2. Triage – Go through each list and separate it into two or even three sub-lists, based on order or time frame of implementation.
    • Currently Available – these are the things that are actually available now. You could install or buy this today and start working on implementing. Of course there is a testing phase and all of that, but these are things that are ready to go now.
    • Coming “Real Soon”™ – these are the things that are currently in beta/tech preview that will be important, but can’t really be used now in main production. Do not neglect these because you want to be ready when the final version is released. Ideally you would have already been playing with the beta in your lab. (You do have a lab right?)
    • Interesting – This is the catch-all for things you aren’t sure about: will it really catch fire like Windows PowerShell did a few years ago, or will it just smolder on and eventually fade away like the Microsoft Zune?
  3. Ranking – Now take the first two lists and put them in order.
    • The first list – the one for work – should be ranked in order of fastest to deploy and adopt. It’s better to roll out something smaller and show some ROI than to spend 6 months deploying a huge monolithic project first. It’s a snowball effect for getting and maintaining momentum, not to mention that a few visible wins will help garner buy-in from other teams.
    • The second list should be in order of release. If something is due out the summer of next year you have some time to get ready. Things due out this summer, not so much.

Now that you have your items at least roughly organized, start working on the details of the first item on your Work / Currently Available list. Go back and watch the videos of the related sessions you missed. Then get a test environment going and do all of your normal “new project” steps.

Once that’s underway, take the first item off your Career / Currently Available list and start boning up. Do not neglect this list! Arguably, this list of things that are currently available and relevant to your career, but not your current job, is the most important list – it’s the list of things you are behind on. Run it in your home lab, watch video sessions, whatever it takes, but get up to speed. You don’t have to be a world-class expert; you should just be familiar enough that you can confidently jump in and quickly ramp up if the tech in question jumps from the “Career” list to the “Work” list.

So just a few tips to help corral the stampede of information flooding in after any big conference. Enjoy!

The fuse for Ignite is lit!

Less than 5 days from now I’ll be winging my way to Chicago for Microsoft Ignite!
Tech-Ed last year was phenomenal and had a tremendous impact on my outlook on our industry. I can’t say enough good things about how these types of events are crucial for the IT pro in a small/medium business. This year with the combined Ignite conference, it’s even bigger and with even more things to learn and people to meet.

Let me tell you, my week is BOOKED! Sunday is badge and materials pickup first thing in the morning, then a full-day pre-conference session on Windows 10 deployment. Sunday night I’m going to call it an early night and do some last-minute review for the 70-411 Administering Windows Server 2012 exam I’m going to take instead of lunch on Monday.
The exams are half off with a free retake within 30 days, so I figure, what the heck, right?

The rest of Monday is going to be keynotes and sessions. I’m really interested to see if there are going to be any announcements for Windows 10 – the latest ‘leaked’ info says late July. Maybe Satya Nadella will give us a solid target date! Monday evening is the opening of the Expo, and this year it’s tied in with Ask the Experts, so that should be interesting and fun. I have a list of people to meet: some to shake their hand and thank them for their work, some to pick their brains, and quite a few of both. At the top of the list are the PowerShell.org guys and, of course, the Microsoft guys. This year I’m going to come out of my shell a bit and get some pictures and signatures.

Tuesday through Friday is all classes, wall to wall! In most cases I’m at least double- and sometimes triple-booked. That may seem strange at first, but as the week progresses, things pop up that change your focus. For example, last year PowerShell was a minor thread when I arrived in Houston for Tech-Ed, but by Tuesday it was the major thing I wanted to hear more about, which had me scurrying to change schedules midweek. I learned quickly that having a fallback schedule, in case a room fills or your focus shifts, is a good idea.

Then of course there are the happy hours, the vendor parties, the Microsoft parties, and all the social stuff going on all week. There should be no lack of things to do, food to eat, and people to meet! On an interesting side note, I found out from our Marketing department at work that one of the biggest marketing-and-design conferences is in Chicago that same week! So downtown Chicago should be rocking all week!

With this being a “new” conference melded from several others (Exchange, SharePoint, etc.), there are some new things that are kind of cool. For example, there’s a Yammer network dedicated to the event, and a way to publish your schedule, which may or may not turn out to be a good thing. All in all, it seems like there’s even more focus on social media and cloud this year.

So to wrap this up here’s the goals for this year:

    1. Learn more about Windows 10
    2. Learn about Office 365 migrations and ROI
    3. Learn everything they’ll tell us about Windows Nano Server
Plus I have a small list of other questions to ask specific vendors.
So I’ve got a big week next week. That makes this week all cram and prep and make sure everything at work is locked down so I’m not distracted by problems back at the office.

Once more with feeling…..

OK so I have tried the whole blogging thing before.
Truth be told, my heart wasn’t in it then. I lacked a bit of confidence that I had anything to contribute to the greater voice of the Internet.
Then I realized that it really doesn’t matter. Not like I’m taking up space that someone else could use. Storage is cheap comparatively speaking and the only way to improve is practice. So here we go!