Get-Command aka Discovering PowerShell

“What’s the PowerShell command for…?” This happens to all of us. We’re working on a task and hit that speedbump where we’ve either forgotten or just don’t know the command to do that thing. This is where it’s very easy to turn to Google or Bing and start searching. There is an easier way!

Really, the issue with search engine results is that they may be incomplete, written for a different version of PowerShell, or simply not there. Whether it’s the wrong keywords, date restrictions, or something else, a better way is right in front of you.

You have a PowerShell window open, right? That’s why you are looking for commands, right? Try this:

PS> Get-Command
That’s nice, but on my machine (with a few modules installed, granted) I get over 6,500 commands that way!
That’s not helpful without some limiting parameters. I do assume that you have an idea of what you are trying to accomplish. This is where the -Verb and -Noun parameters come in handy.

Let’s take a random example to walk through: What is the command to determine what services are running on a computer?
Our plain-English request has verbs and nouns, and PowerShell commands have verbs and nouns.
With a little knowledge of PowerShell structure, it’s pretty easy to work through.

Quick refresh: PowerShell commands (cmdlets) are structured in a “Verb-Noun” syntax.
Of all the verbs in the English language, PowerShell has (at this writing) about 100 approved verbs. You can see the list at Approved Verbs for PowerShell Commands.
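You don’t even need the web page, by the way – PowerShell can list its own approved verbs with the built-in Get-Verb cmdlet. A quick sketch:

```powershell
# List every approved verb along with its group (Common, Data, Lifecycle, ...)
Get-Verb | Sort-Object Verb

# Count them
(Get-Verb).Count

# Or look for a verb matching what you want to do
Get-Verb | Where-Object Verb -like 'F*'
```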

That’s a lot, but we can narrow those down pretty quickly by remembering a few things – common verbs are things like Add, Clear, Close, Copy, Enter, Exit, Find, Get, Hide, etc. We recognize ‘Get’ from other things like Get-Command! ‘Find’ also looks attractive in this case, so let’s have a quick look at that.

PS> Get-Command -Verb Find

CommandType     Name                        Version    Source
-----------     ----                        -------    ------
Function        Find-Certificate                       xCertificate
Function        Find-Command                2.2.5      PowerShellGet
Function        Find-DscResource            2.2.5      PowerShellGet
Function        Find-IpamFreeAddress                   IpamServer
Function        Find-IpamFreeRange                     IpamServer
Function        Find-IpamFreeSubnet                    IpamServer
Function        Find-Module                 2.2.5      PowerShellGet
Function        Find-NetIPsecRule                      NetSecurity
Function        Find-NetRoute                          NetTCPIP
Function        Find-RoleCapability         2.2.5      PowerShellGet
Function        Find-Script                 2.2.5      PowerShellGet
Cmdlet          Find-Package                1.4.7      PackageManagement
Cmdlet          Find-PackageProvider        1.4.7      PackageManagement

Ok, that’s better, but none of those seem to have anything to do with services. Let’s come at it from the ‘service’ side. You can list all the commands with a certain noun (or part of a noun) using the -Noun parameter.

PS> get-command -noun service

CommandType     Name                     Source
-----------     ----                     ------
Cmdlet          Get-Service              Microsoft.PowerShell.Management
Cmdlet          New-Service              Microsoft.PowerShell.Management
Cmdlet          Remove-Service           Microsoft.PowerShell.Management
Cmdlet          Restart-Service          Microsoft.PowerShell.Management
Cmdlet          Resume-Service           Microsoft.PowerShell.Management
Cmdlet          Set-Service              Microsoft.PowerShell.Management
Cmdlet          Start-Service            Microsoft.PowerShell.Management
Cmdlet          Stop-Service             Microsoft.PowerShell.Management
Cmdlet          Suspend-Service          Microsoft.PowerShell.Management

Aha! That first one, Get-Service, looks good! Let’s try that one.

PS> Get-Service

Status   Name               DisplayName
------   ----               -----------
Running  AarSvc_11d5ae      Agent Activation Runtime_11d5ae
Running  AdobeARMservice    Adobe Acrobat Update Service
Running  AdobeUpdateService AdobeUpdateService
Running  AGMService         Adobe Genuine Monitor Service
Running  AGSService         Adobe Genuine Software Integrity Serv…

Bingo! Of course I abbreviated that list. There are 299 services on this machine right now. Perhaps we should filter that down a bit. Parameters are a good way to do just that. You can find the parameters using the Get-Help cmdlet.

PS> Get-Help Get-Service


    Get-Service [[-Name] <string[]>] [-DependentServices] [-RequiredServices]
    [-Include <string[]>] [-Exclude <string[]>] [<CommonParameters>]

    Get-Service -DisplayName <string[]> [-DependentServices]
    [-RequiredServices] [-Include <string[]>] [-Exclude <string[]>]

    Get-Service [-DependentServices] [-RequiredServices] [-Include
    <string[]>] [-Exclude <string[]>] [-InputObject <ServiceController[]>]


Let’s say that the service we are interested in is something to do with printers. We can use wildcards in the -DisplayName parameter to narrow this down:

PS> Get-Service -DisplayName *print*

Status   Name               DisplayName
------   ----               -----------
Running  DLPWD              Dell Printer Status Watcher
Running  DLSDB              Dell Printer Status Database
Stopped  PrintNotify        Printer Extensions and Notifications
Stopped  PrintWorkflowUser… PrintWorkflow_11d5ae
Running  Spooler            Print Spooler

Of course, you can also use the PowerShell pipeline to filter with Where-Object, or to start, stop, and restart the services you discover – but all of that is another story for another time!
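As a quick taste of that pipeline filtering, here’s a minimal sketch – everything in it is built in (Get-Service, Where-Object, Sort-Object), though the output will obviously depend on your machine:

```powershell
# List only the services that are currently running
Get-Service | Where-Object Status -eq 'Running'

# Combine the pipeline with the -DisplayName wildcard from above,
# then sort the matches by display name
Get-Service -DisplayName *print* |
    Where-Object Status -eq 'Running' |
    Sort-Object DisplayName
```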

Cybersecurity Executive Briefing – Why Defense in Depth

This started as an email to a few people but as the word count climbed, I started thinking that there may be other SMB IT folks out there trying to explain cybersecurity to Executive teams. I am NOT a security ‘expert’, but I do keep as up to date as possible. I hope this helps.

There are two key concepts in cybersecurity today – Defense in Depth and Assumed Breach. To understand them fully takes an entire career of knowledge and research, but we can boil them down to a few base principles.

Back in the Olden Days, there were no smartphones and only a few laptops, and the main concern was Keep the Bad Guys Out. We installed firewalls to block access into our networks, added some filters for email viruses, and called it done.

Alas, the villainous hordes were not so easily defeated! The increasingly mobile user started becoming a Typhoid Mary, carrying in bad things from the outside. Since it’s on a ‘trusted’ device, it’s welcomed into the ‘safe’ network.


Like a sick child bringing home colds and flu, these mobile devices coming back from the outside world brought back various malicious software (malware). Some were viruses that spread from machine to machine by infecting documents or programs. Some were worms that moved through the network by tunneling around the files and folders accessible to a ‘trusted’ machine. Some simply downloaded remote-control software and awaited instructions as part of a robot network of attackers (botnet). All of this happens inside the ‘safe’ perimeter.

The way to defend against this is a posture of “Assumed Breach”. Stated simply, this is acknowledging that at some point an infected machine or file will be on your network. Now what? Detection and containment are usually the first steps, implemented through machine-level firewalls, restricted network shares, and routine backups and scans.

The best practices of today all recommend a ‘Defense in Depth’ strategy. Conceptually this is pretty straightforward, but a little trickier to implement. The base idea is like a soldier in a hostile country.

  • At the most personal level, our trooper has a helmet and body armor. On your laptop, this is limiting administrative authorities, disk encryption, and anti-malware software.
  • Next out are the protections for the group. For our troopers, it’s walls and gates with guards. Your network has a firewall with intrusion detection and filters to keep the bad stuff out.
  • Finally, there is air support. For our trooper, this may be a group of fighter jets flying around to detect and intercept incoming ‘bogeys’. For our network, this is a cloud-based filter that stops problems before they even get to the perimeter.

The issue with many corporate solutions goes back to the whole “mobile” concept. When our troopers are out in the jungles of the cyber world (aka the Airport Lounge), they don’t have the group protection of the firewall layer. With no cloud-based systems, it’s down to just body armor or what security they have locally installed.

Just like how our troops have more advanced body armor than their great-grandfathers in World War II, advances in weapons require advances in protection. It’s an arms race between the security experts and the bad actors, and it’s a fast-moving one.

Keeping up is difficult even for large companies with dedicated security staff, and it’s exponentially more difficult for smaller companies where the technical staff are forced to wear more hats. That’s why SaaS (Software as a Service) offerings are so attractive. Most well-reviewed SaaS products are very good at early detection and protection from outside attacks. The best will also prevent a botnet-compromised machine from ‘reporting back’ for instructions. A few thousand dollars per year is still cheaper than a very public breach or hiring a dedicated security person!

Some executives will doubt whether all this security is worth it. Sure, responsible leadership has a cybersecurity clause on their business insurance, and even with multiple layers and backups, you may still get hit. But to switch analogies a bit: we all have a theft clause on our homeowner/renter insurance, right? We still lock our doors when we leave.

So what’s the answer? What is the solution that fixes all of this so we can move on with business? Sadly, there is no one-size-fits-all solution. The answer is ‘It Depends’. What is your budget? What is your tolerance for risk? A very small business with few digital files would likely be OK with a daily or weekly backup to a drive stored offline. You could call that 1.5 layers. Larger businesses with a larger digital presence and less tolerance for loss of files and capability will need more protection.


Packing up for Ignite!

Everyone has their own opinions about what to take when packing for a conference. Obviously a lot of the conferences are smaller than Ignite and that makes for a slightly different packing list.

Here are a few of the things on my checklist:

1. Two pairs of shoes. It seems silly, but I can vouch for the fact that swapping shoes every day makes a difference. I for one usually don’t walk 17,000 steps in a normal day. However, at Ignite, a 17k day is not unusual for me. Good, well broken in, comfortable shoes are invaluable.

2. Powerbanks. Those little USB power chargers will keep your phone juiced up all day. Power plugs can be scarce at times. Having a big battery to tide you over can be the difference between being able to tap out a quick note vs. having to write it down in a pad.

3. A notebook. As much as you try, you may run out of juice and need to take notes the old fashioned way. I prefer the half size spirals. Usually there are some vendors in the Expo/Hub giving away branded notepads as well.

4. Light jacket/windbreaker. The temperature can swing wildly from hot outside and warm in the halls to chilly in the rooms for breakouts. I like to take a light warmup jacket that will fit easily in my pack.

5. USB multiport charger. I have a little USB charger with 5 ports that plugs into a regular wall plug. This is for your hotel room to charge all of your stuff at the same time. There are never enough plugs accessible in hotel rooms.

6. Hand sanitizer. You will be shaking hands a lot. Nuff said.

7. Business cards. If your company does not provide them, most Office Depots will make you a set of 50 or 100 inexpensively and ready in a couple of hours.

8. Extra space. You will be bringing back t-shirts and other swag. Even if you don’t want the stuff, it’s nice to bring it back to your team. Either only pack your suitcase 2/3 full or bring an empty gym bag in your main suitcase. It’s amazing how many t-shirts can be stuffed in a decent size gym bag!

Of course, some of these seem obvious, but I personally have forgotten at least a few of these things! Trust me, it’s much cheaper to bring these things than to buy at the airport or hotel.

Other than that, eat and drink responsibly, stay hydrated, and have fun. If you tend to be a little introverted, step out of your comfort zone and introduce yourself to at least one other attendee per day. Vendors do not count since they are working at meeting YOU!

That about sums it up for now. I’m going to try to post more during the week, time permitting.

Powershell Tuesday Quick Tip #7

Back at it again!

This tip is a little snippet of code for building dynamic distribution lists based on something other than organizational units, branch offices, or the other usual examples. This one builds a list of managers.

How does one differentiate managers in Active Directory? Title? No, because at least in our organization, the management team has different titles – things like ‘Vice President’, ‘Director’, or ‘Controller’ in addition to the more mundane ‘Manager’. This is complicated by the fact that we refer to our outside salesforce as ‘Territory Managers’. So the Title property is right out.

What is the one other property that defines a member of a leadership/management team? Yep – they have someone reporting to them. Enter the “Direct Reports” property!

Team that up with the New-DynamicDistributionGroup cmdlet and you get an example like this:

New-DynamicDistributionGroup -DisplayName 'My Company Managers' `
    -Name MCManagers `
    -PrimarySmtpAddress '' `
    -RecipientFilter { DirectReports -ne $null }

*Standard disclaimer about minding the backtick line continuations. This is actually a one-liner, but since the important part is at the end, this format is easier to read.

The result is a list of anyone in your organization who has someone reporting to them. Exactly what the HR folks want when they ask for a mailing list of all the managers!
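Before handing the list over to HR, you can sanity-check who actually matches the filter. This sketch assumes an Exchange (or Exchange Online) session and the group name used above:

```powershell
# Fetch the dynamic group created above
$group = Get-DynamicDistributionGroup -Identity 'MCManagers'

# Preview the recipients that the filter currently matches
Get-Recipient -RecipientPreviewFilter $group.RecipientFilter |
    Select-Object Name, Title
```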

Manage your Stack with WAC and OpenManage

In a previous post, I briefly mentioned managing your Azure Stack HCI with Windows Admin Center (WAC). When you are using Dell hardware, it’s even better with OpenManage!

Imagine being able to securely control your server stack from any browser. Even if you don’t allow access from outside the network, imagine how nice it would be to have a ‘single pane of glass’ to manage most of your day-to-day tasks. Now imagine that it’s free…

Setting up

First you have a decision to make. Do you want to run WAC from your pc or as a service on a server? I like to run the latest General Availability release on a server for our whole team to use, and the latest Insider build on my laptop for testing and comparison.

So how do you get it? Simply go to the Windows Admin Center web site and download the MSI file. It’s the same installer whether you are installing on Windows 10 or Windows Server, so that’s easy.

Before you install, you have to decide on the ‘type’ of install you want. There are four main types of installations, ranging from the super simple ‘Local Client’ install all the way up to a failover cluster of management servers. Here are the differences.

First is the simplest of installations – the Local Client. You just install it on a Windows 10 PC that has network connectivity to all of the servers to be managed. This is fine for small, single-administrator environments or, as I noted above, for testing Insider builds.

Next is the ‘Gateway’ version. It’s installed on a designated Windows Server 2016 or higher machine and runs as a service. This is the best one for access by a small team of admins.

Our third option is worth noting as it is a supported and documented configuration called ‘Managed Server’. In this scenario, you install the software directly on one of the servers to be managed. I don’t recommend it unless you have some reason not to create a dedicated ‘gateway’ VM. Of course, I’m a fan of single-purpose servers, so your situation may vary.

The fourth and final option for a LAN installation is a failover cluster. Truthfully, if you are in a large environment, this is certainly the way to go. It gives you high availability and reliability for your management tools. In a small or medium business, at this time, it’s a little overkill.

My deployment is a bit of a hybrid. I’m running a gateway server on a VM sitting atop an Azure Stack HCI cluster for the team to use, so it’s highly available enough for our needs.

One of the big advantages of WAC over traditional management tools is extensibility. There is a Windows Admin Center SDK along with examples on GitHub if you want to write your own. However, there are several ready-made extensions provided by Microsoft and a few hardware vendors. Microsoft’s extensions cover things like DNS, DHCP, and Active Directory. The real benefit is if your hardware vendor provides an extension.

For example – as noted in my Azure Stack HCI post – I’m using Dell hardware. The Dell EMC OpenManage extension lets me use Windows Admin Center to manage not only my software environment but my server hardware as well! Disk health, temperatures, fan speeds, and more, all in one handy dashboard.

Azure Stack with Only Two Servers

Ok, so that title is a teeny bit of an overstatement. What we’re going to discuss today is not the Azure Stack you have heard so much about. No, today we’re talking about an Azure Stack HCI Cluster.

Essentially, it’s a Storage Spaces Direct cluster with brand new shiny marketing and management. While you can use the old “*.msc” tools, it’s a far, far better thing to use the new Windows Admin Center (WAC)! More on that soon.

I’m not going to dig into the details of how S2D works or the fine details of building clusters large and small. Instead, I want to share with you some of the reasons why small and medium businesses need to pay attention to this technology pronto.

  1. Scalability: Sure, most of the documentation you find today focuses on building clusters of 4-6 nodes or more. That’s great unless you are a small business that doesn’t need that kind of compute and storage. That’s where the two-node “back to back” cluster comes in. Think “entry-level”. The beauty of this solution is scalability. If you do outgrow this, you buy a new node, possibly a new network switch and you are up to a 3-node cluster!
  2. Compact: I had two old servers that took up two Rack Units (RU) of space each plus an external drive array that took another 2 RU. That totaled up to 6 Rack Units or 10.5 inches of rack space. The new servers that replaced them are 1 Rack Unit each for a total of 3.5 inches! That doesn’t even touch on noise, heat and power consumption.
  3. Fast and easy: Ok, yes, it is a complicated technology. However, you can follow Dell’s Azure Stack HCI Deployment Guide and you’ll be 90% of the way there. It’s a mouthful, but it’s 40 pages of step-by-step instructions and checklists. *TIP* I’ve included some tips below for a couple of places that aren’t super clear (at least to me).
  4. Well Documented: If you are like me and want to understand how this all works before you trust your mission-critical VMs to it, there is a ton of good information out there. Here are some options depending on your preferred method.

    • The Book: Dave Kawula’s Master Storage Spaces Direct. It’s over 300 pages of detailed explanation for a very reasonable price. Although you can get it free, those people spent a lot of time working on it, so pay something. You’ll see what I mean on the Leanpub site.
    • The Video: Pluralsight’s Greg Shields has a course on Storage Spaces Direct that is 4 hours of in-depth instruction on Windows Failover Clusters and Storage Spaces Direct in Windows 2019. If you aren’t a subscriber to Pluralsight, they offer trial periods!
    • The ‘Official’ Documentation: Microsoft’s Storage Spaces Direct Overview is the place for the official documentation.

There are a few tips and gotchas that I want to share from my experience.
First, the hardware. These servers aren’t cripplingly expensive, but they certainly aren’t disposably cheap either. This means there are a lot of critical hardware decisions to make. What’s more important to your organization – budget, speed, or a balance? On one end, you can go all-flash storage, which will give you a fast system, but the price tag goes up fast too. The less expensive but slightly more complicated setup is a hybrid of SSD and HDD storage. Making certain that you have the right mix of memory, storage, and network adapters can be a daunting task.

Honing a specification and shopping it around to get the perfect balance of form, function, and price is great if you are a hobbyist or plan on building out a ton of these at one time.

However, for IT admins in most small companies, the more important thing is that it gets done quickly, quietly, and correctly. The business owners don’t care about the fine nuances. They want to start realizing the business benefits of the new technology.

I chose a much more economical option, both in time and cash. I searched Dell’s website for “Microsoft Storage Spaces Direct Ready Nodes” and picked out a pair of their “Ready Nodes” that looked like they matched my budget and needs.

Then it was a small matter of reaching out to my Dell team. My sales rep put me in touch with their server/storage specialist. He asked a couple of questions about workloads, storage, and networking. Presto, we had our servers on order.

*Pro-tip* Buy the fiber patch cables at the same time. They are no less expensive elsewhere, and you have less chance of getting crap cables.

If you don’t already have a relationship with Dell, there are several other Microsoft certified hardware vendors. There is a list here:

 Tips for the build

You’ve only got two servers to configure, so typing each command is feasible. However, in the interest of both documenting what I did and saving on silly typos, I opened up VS Code and wrote up all the PowerShell commands described in the Dell docs.

The steps (much simplified) look something like:

  1. Install OS if not preinstalled and then patch to current.
  2. Plan and Map out IP networking. Use the checklists in the back of the Dell guide to map out your IP scheme. It’s a time saver!
  3. Pre-stage the Active Directory accounts for the Cluster as documented here. Trust me, it’s faster to do it on the front side than after the fact.
  4. Install Windows Features on each node
  5. Virtual Switch and Network configuration on each node
  6. Test-Cluster and Review/Remediation
  7. Create Cluster – with no storage
  8. Enable Storage Spaces Direct on the cluster.
  9. Configure a witness – either another server on the site or a cloud witness.
  10. Create virtual disks – Cluster creation and enabling Storage Spaces Direct on the cluster creates only a storage pool and does not provision any virtual disks in the storage pool.
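The scripted steps (4 and on) look roughly like the sketch below. The node names, cluster name, IP address, and witness path are placeholders – follow the Dell guide for the real networking and remediation details:

```powershell
# 4. Install the required Windows features on each node
Install-WindowsFeature -Name Hyper-V, Failover-Clustering, FS-FileServer -IncludeManagementTools

# 6. Validate the proposed cluster and review the report it produces
Test-Cluster -Node 'Node1', 'Node2' `
    -Include 'Storage Spaces Direct', 'Inventory', 'Network', 'System Configuration'

# 7. Create the cluster with no storage
New-Cluster -Name 'HCI-Cluster' -Node 'Node1', 'Node2' -NoStorage -StaticAddress '10.0.0.50'

# 8. Enable Storage Spaces Direct (this creates the storage pool)
Enable-ClusterStorageSpacesDirect

# 9. Configure a witness (file share witness shown; a cloud witness also works)
Set-ClusterQuorum -FileShareWitness '\\WitnessServer\WitnessShare'

# 10. Provision a virtual disk (volume) from the pool
New-Volume -FriendlyName 'Volume1' -FileSystem CSVFS_ReFS -StoragePoolFriendlyName 'S2D*' -Size 1TB
```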

Next time, we’ll go into depth on the management of our newly built Azure Stack HCI setup!

Convert DHCP Lease to Reservation with Powershell

Scenario: You have just created (by whatever means) a new server. It has gotten an IP address from DHCP and all is working as it should… However… perhaps this server provides a service that requires a static IP (for whatever reason).

In the old days, you would fire up the DHCP tool from the RSAT tools or <shudder> RDP into your domain controller to convert the dynamic DHCP lease into a reservation. There is a faster way… if you guessed PowerShell – you are correct 🙂

It took a little spelunking around to work this out but here are the steps to follow.

First, create a remote powershell session to your DHCP server.

 Enter-PSSession -ComputerName MyDC.houndtech.pri 

Next, find out the IP address of the new server. Easy enough with:

Test-NetConnection -computername Newserver.houndtech.pri 

Let’s make things a little easier for later and put that returned object in a variable.

$x = Test-NetConnection -computername Newserver.houndtech.pri  

But we don’t need the whole object, just the IP address, so let’s narrow it down with

$IP = $x.RemoteAddress.IPAddressToString

Now $IP contains JUST the IP address, which is what we need for the next series of cmdlets.
Next we retrieve the DHCP lease object with

 Get-DHCPServerV4Lease -IPAddress $IP 

Finally pipe it to the cmdlet to add the reservation.

 Get-DHCPServerV4Lease -IPAddress $IP | Add-DHCPServerV4Reservation 

Of course, this could be piped together into a sweet one-liner that would fit well in any automated provisioning script.

 Get-DHCPServerV4Lease -IPAddress ((Test-NetConnection -ComputerName 'NewServer.HoundTech.pri').RemoteAddress.IPAddressToString)| Add-DhcpServerv4Reservation 
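If it does end up in a provisioning script, it may be worth wrapping in a small function. This is just a sketch of my own – the function name and parameters are hypothetical, and it assumes the DhcpServer module is available locally (so you can target the DHCP server directly instead of using Enter-PSSession):

```powershell
function Convert-DhcpLeaseToReservation {
    param(
        # FQDN of the newly built server
        [Parameter(Mandatory)] [string] $ComputerName,

        # Name of the DHCP server holding the lease
        [Parameter(Mandatory)] [string] $DhcpServer
    )

    # Resolve the server's current (leased) IP address
    $ip = (Test-NetConnection -ComputerName $ComputerName).RemoteAddress.IPAddressToString

    # Look up the lease and convert it to a reservation
    Get-DhcpServerv4Lease -ComputerName $DhcpServer -IPAddress $ip |
        Add-DhcpServerv4Reservation -ComputerName $DhcpServer
}

# Example usage:
# Convert-DhcpLeaseToReservation -ComputerName 'NewServer.HoundTech.pri' -DhcpServer 'MyDC.houndtech.pri'
```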

See you next time!

Powershell and Devops Summit 2018

Last week was the big Powershell/Devops Summit in Bellevue, WA. I say “big” not as in ginormous 15,000 attendee extravaganzas like Ignite or VMWorld. No, this 365 attendee Summit was big as in the stature of the people there. All the Powershell superstars were there, sharing their knowledge and enthusiastically pushing the rest of us to excel.

This was my first Summit, and although I have waded into the Powershell Community pool, this was a dive into the deep end! Happily, I managed to keep up and learn quite a few things that I can immediately apply. I also brought back copious notes on things to try out in the old Lab.

It would be a novella to describe all the things I learned, but here are a few highlights and key takeaways.

  • Powershell 6.x is the way forward. Cross-platform and lightweight, it will run on almost anything. There was a demo of a Raspberry Pi with Powershell 6 installed, sending sensor information (from a heat/humidity sensor) and controlling an attached light. Pretty nifty. Also, Cloud Shell (Azure Powershell in a browser) either runs v6 now or soon will.
  • To utilize old modules, soon to be released: Windows Powershell Compatibility Pack. This is a clever solution. It allows you to essentially remote into your own PC’s Windows Powershell (5.1) session. It’s a little confusing until you remember that Powershell 6 and Powershell 5.1 are different executables and can and do run side by side.
  • Powershell Classes + REST APIs = Super functions. In a nutshell, use classes to build objects out of data returned from Invoke-RestMethod. Once it’s a fully fleshed out PSObject, you have many more options for interacting with that data. Powershell is all about objects, after all. For more info, reach out to Jeremy Murrah on Twitter or check out his presentation on GitHub.
  • Desired State Configuration Pull Server is becoming more of a community/open source project. It appears (and maybe I misunderstood) that Microsoft isn’t doing much development on the Pull Server portion of DSC. They are focusing on the Local Configuration Manager (LCM). This makes a lot of sense, it’s easy and not expensive to use Azure Automation as your pull server. There are also a few other Open Source Pull servers like TUG
  • Lean Coffee! For a completely non-Powershell side session, Glenn Sarti introduced a few of us to ‘Lean Coffee’. It’s not a skinny half-caf soy latte; it’s a way to organize small informal meetings. Briefly, everyone gets Post-it notes (or other slips of paper) and writes down 2-3 things they want the group to discuss at the meeting. Everyone votes on what they want to discuss, and that determines the order of conversation. Someone acts as timekeeper, and every 2 minutes polls the group for a simple thumbs up/down vote. If the majority votes up, the conversation stays on that topic; otherwise, you move on to the next highest-voted topic. Repeat until time or topics run out. I am definitely going to try this in our next team meeting! For more info – check out :
  • I learned a LOT about CI/CD pipelines for Powershell using VSTS and a few other tools, across separate sessions. This topic needs a blog post or six all its own. Two things to remember: 1. it can start simple and build from there, and 2. Plaster frameworks make the dream work.
  • Thomas Rayner showed us how to make custom rules for PSScriptAnalyzer. Time to make some “house rules”!
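To illustrate the classes-plus-REST idea from the list above, here’s a minimal sketch of my own – the GitHub API endpoint and property names are just an example I picked, not from the talk:

```powershell
# Shape raw REST data into a typed object with a class
class RepoInfo {
    [string] $Name
    [int]    $Stars

    RepoInfo([pscustomobject] $raw) {
        $this.Name  = $raw.name
        $this.Stars = $raw.stargazers_count
    }
}

# Invoke-RestMethod converts the JSON response into PSObjects for us
$raw = Invoke-RestMethod -Uri 'https://api.github.com/orgs/PowerShell/repos'

# Promote each raw record to a full-fledged object and work with it
$repos = $raw | ForEach-Object { [RepoInfo]::new($_) }
$repos | Sort-Object Stars -Descending | Select-Object -First 5
```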

There was a LOT more that I have noted to study up on and try out, those will be later posts I’m sure.

Of course, Jeffrey Snover continues to amaze me with his enthusiasm and optimism. Some great ‘Snover-isms’ heard: “Line noise should not compile!” and “Like cockroaches in a kitchen full of cowboys”.

Aside from actual sessions, there were several interesting conversations not only with superstars like Don Jones™ and Mark Minasi, but also with the “next-gen” stars like Michael Bender and James Petty and with regular Powershell guys like me.

As for the actual event, the folks that put this together are top-notch. They care about the experience we have and it shows. Very good food, opportunities to socialize, and a refreshing lack of hard sell vendors. One of the few conferences I go to that doesn’t result in an email flood in the week following.

Bellevue was great – it only rained 4 of the 5 days I was there! Seriously though, the rain wasn’t an issue – even though I walked everywhere except to and from the airport. A light misting drizzly rain wasn’t horrible, and the temperatures were great. Cool enough so that you could vigorously walk up hills without getting sweat soaked and warm enough that only a light jacket was needed.


Powershell Tuesday Quick Tip #6

This week’s tip is a bit longer – it’s actually a little script to remotely create a shortcut to a network file share on a user’s desktop. This one comes in handy when explaining how to make a shortcut over the phone proves… difficult.

# Remote PC name, user, and share path (fill in for your environment)
$PC = ''
$User = 'LSkywalker'
$TargetPath = "\\\death_star_plans\"
# Build the path to the user's desktop via the C$ admin share
$ShortcutFile = "\\$PC\C$\Users\$User\Desktop\Death Star.lnk"
$Obj = New-Object -ComObject WScript.Shell
$Shortcut = $Obj.CreateShortcut($ShortcutFile)
$Shortcut.TargetPath = $TargetPath
# Don't forget to save, or the .lnk file is never actually written
$Shortcut.Save()

This one is a good candidate for getting wrapped up in a function and added to an “Admin Toolkit Module”.
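Here’s one sketch of what that toolkit function might look like – the function name and parameters are my own invention, so adjust to taste:

```powershell
function New-RemoteDesktopShortcut {
    param(
        [Parameter(Mandatory)] [string] $ComputerName,
        [Parameter(Mandatory)] [string] $UserName,
        [Parameter(Mandatory)] [string] $TargetPath,
        [Parameter(Mandatory)] [string] $ShortcutName
    )

    # Reach the user's desktop through the C$ admin share
    $file = "\\$ComputerName\C$\Users\$UserName\Desktop\$ShortcutName.lnk"

    $shell = New-Object -ComObject WScript.Shell
    $shortcut = $shell.CreateShortcut($file)
    $shortcut.TargetPath = $TargetPath
    $shortcut.Save()
}

# Example usage:
# New-RemoteDesktopShortcut -ComputerName 'PC042' -UserName 'LSkywalker' `
#     -TargetPath '\\FileServer\death_star_plans\' -ShortcutName 'Death Star'
```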

Powershell Tuesday Quick Tip #5

Here’s another one from the easy-but-annoying-to-rethink stack.
People, as someone once said, are ‘squishy and move around a lot’. Not only do they move, but they change – and sometimes the change involves their names.
For example, the customer service person gets married or the sales manager gets divorced. (Hopefully not for the same reason – but that’s a different blog.)

So you can either fire up the Active Directory tool of your choice (ADAC or ADUC) OR switch over to your PowerShell console that you have open already.
… you DO have it open already right?

Set-AdUser -Identity Leia -Surname 'Solo' -DisplayName "Leia Solo"
Set-Mailbox -Identity Leia -PrimarySmtpAddress '' -EmailAddressPolicyEnabled:$False -DisplayName 'Leia Solo' -EmailAddresses @{add =''}

The first line “Set-ADUser” changes Leia’s surname to Solo and corrects her display name. The second line, “Set-Mailbox” changes her primary email address to “” along with her display name while adding her original email address back in as an alternate. This is so Leia will continue to get email from some people that haven’t gotten the happy news yet.

These two lines require that you have the Active Directory module and a remote session to either Office365 or your Exchange server.

Next week I’ll be blogging from the PowerShell + DevOps Global Summit in beautiful Bellevue WA!