
Get-Command aka Discovering PowerShell

“What’s the PowerShell command for…?” This happens to all of us. We’re working on a task and hit that speed bump where we’ve either forgotten or simply don’t know the command to do that thing. This is where it’s very easy to turn to Google or Bing and start searching. There is an easier way!

Really, the issue with search engine results is that they may be incomplete, written for a different version of PowerShell, or simply not there. Whether it’s the wrong keywords, date restrictions, or whatever, a better way is right in front of you.

You have a PowerShell window open, right? That’s why you’re looking for commands, right? Try this:

Get-Command

That’s nice, but on my machine (with a few modules installed, granted) I get over 6,500 commands that way!
That’s not helpful without some limiting parameters. I’ll assume you have an idea of what you’re trying to accomplish, and this is where the -Noun and -Verb parameters come in handy.

Let’s take a random example to walk through: What is the command to determine what services are running on a computer?
Our plain-English request has verbs and nouns, and PowerShell commands have verbs and nouns.
With a little knowledge of PowerShell structure, it’s pretty easy to work through.

Quick refresher: PowerShell commands (cmdlets) are structured in a “Verb-Noun” syntax.
Of all the verbs in the English language, there are (at this writing) about 100 approved for PowerShell use. You can see the list at Approved Verbs for PowerShell Commands.
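You don’t even have to leave the shell to see that list – the built-in Get-Verb cmdlet will print it for you:

```powershell
# List every approved verb along with its group (Common, Data, Lifecycle, etc.)
Get-Verb | Sort-Object Verb | Select-Object Verb, Group

# Or check whether a specific verb is on the approved list
Get-Verb -Verb Find
```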

That’s still a lot, but we can narrow those down pretty quickly by remembering a few things. Common verbs are things like Add, Clear, Close, Copy, Enter, Exit, Find, Get, and Hide. We recognize ‘Get’ from other things – like GET-COMMAND! ‘Find’ also looks attractive in this case, so let’s have a quick look at that.

PS> Get-Command -Verb Find

CommandType     Name                                               Version    Source
-----------     ----                                               -------    ------
Function        Find-Certificate                                   2.5.0.0    xCertificate
Function        Find-Command                                       2.2.5      PowerShellGet
Function        Find-Command                                       1.0.0.1    PowerShellGet
Function        Find-DSCResource                                   2.2.5      PowerShellGet
Function        Find-DscResource                                   1.0.0.1    PowerShellGet
Function        Find-IpamFreeAddress                               2.0.0.0    IpamServer
Function        Find-IpamFreeRange                                 2.0.0.0    IpamServer
Function        Find-IpamFreeSubnet                                2.0.0.0    IpamServer
Function        Find-Module                                        2.2.5      PowerShellGet
Function        Find-Module                                        1.0.0.1    PowerShellGet
Function        Find-NetIPsecRule                                  2.0.0.0    NetSecurity
Function        Find-NetRoute                                      1.0.0.0    NetTCPIP
Function        Find-RoleCapability                                2.2.5      PowerShellGet
Function        Find-RoleCapability                                1.0.0.1    PowerShellGet
Function        Find-Script                                        2.2.5      PowerShellGet
Function        Find-Script                                        1.0.0.1    PowerShellGet
Cmdlet          Find-Package                                       1.4.7      PackageManagement
Cmdlet          Find-PackageProvider                               1.4.7      PackageManagement

OK, that’s better, but none of those seem to have anything to do with services. Let’s come at it from the ‘service’ side. You can list all the commands with a certain noun (or part of a noun) using the -Noun parameter.

PS> get-command -noun service

CommandType     Name                                     Version    Source
-----------     ----                                     -------    ------
Cmdlet          Get-Service                              7.0.0.0    Microsoft.PowerShell.Management
Cmdlet          New-Service                              7.0.0.0    Microsoft.PowerShell.Management
Cmdlet          Remove-Service                           7.0.0.0    Microsoft.PowerShell.Management
Cmdlet          Restart-Service                          7.0.0.0    Microsoft.PowerShell.Management
Cmdlet          Resume-Service                           7.0.0.0    Microsoft.PowerShell.Management
Cmdlet          Set-Service                              7.0.0.0    Microsoft.PowerShell.Management
Cmdlet          Start-Service                            7.0.0.0    Microsoft.PowerShell.Management
Cmdlet          Stop-Service                             7.0.0.0    Microsoft.PowerShell.Management
Cmdlet          Suspend-Service                          7.0.0.0    Microsoft.PowerShell.Management

Aha! That first one, Get-Service, looks good! Let’s try it.

PS> Get-Service

Status   Name               DisplayName
------   ----               -----------
Running  AarSvc_11d5ae      Agent Activation Runtime_11d5ae
Running  AdobeARMservice    Adobe Acrobat Update Service
Running  AdobeUpdateService AdobeUpdateService
Running  AGMService         Adobe Genuine Monitor Service
Running  AGSService         Adobe Genuine Software Integrity Serv…

Bingo! Of course, I abbreviated that list; there are 299 services on this machine right now. Perhaps we should filter that down a bit, and parameters are a good way to do just that. You can find the parameters using the Get-Help cmdlet.

PS> get-help get-service

NAME
    Get-Service

SYNTAX
    Get-Service [[-Name] <string[]>] [-DependentServices] [-RequiredServices]
    [-Include <string[]>] [-Exclude <string[]>] [<CommonParameters>]

    Get-Service -DisplayName <string[]> [-DependentServices]
    [-RequiredServices] [-Include <string[]>] [-Exclude <string[]>]
    [<CommonParameters>]

    Get-Service [-DependentServices] [-RequiredServices] [-Include
    <string[]>] [-Exclude <string[]>] [-InputObject <ServiceController[]>]
    [<CommonParameters>]


ALIASES
    gsv

Let’s say that the service we are interested in is something to do with printers. We can use wildcards with the -DisplayName parameter to narrow this down:

PS > Get-Service -DisplayName *print*

Status   Name               DisplayName
------   ----               -----------
Running  DLPWD              Dell Printer Status Watcher
Running  DLSDB              Dell Printer Status Database
Stopped  PrintNotify        Printer Extensions and Notifications
Stopped  PrintWorkflowUser… PrintWorkflow_11d5ae
Running  Spooler            Print Spooler

Of course you can use the PowerShell pipeline to do Where-Object filtering, start/stop/restart the services discovered, but all of that is another story for another time!
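As a quick taste of that pipeline idea, here’s a minimal sketch: take the print-related services and keep only the ones currently running. (The Restart-Service line is commented out – it needs an elevated prompt.)

```powershell
# Keep only the running print-related services
Get-Service -DisplayName *print* |
    Where-Object { $_.Status -eq 'Running' }

# Restart the spooler if needed (requires an elevated session)
# Restart-Service -Name Spooler
```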

PowerShell Tuesday Quick Tip #7

Back at it again!

This tip is a little snippet of code for building Dynamic distribution lists for something other than Organizational Units or Branch offices or the other examples. This one is to build a list of Managers.

How does one differentiate managers in Active Directory? Title? No, because at least in our organization the management team has a variety of titles – things like ‘Vice President’, ‘Director’, or ‘Controller’ in addition to the more mundane ‘Manager’. This is complicated by the fact that we refer to our outside salesforce as ‘Territory Managers’. So the Title property is right out.

What is the one other property that defines a member of a leadership/management team? Yep – they have someone reporting to them. Enter the “Direct Reports” property!

Team that up with the New-DynamicDistributionGroup cmdlet and you get something like this:

New-DynamicDistributionGroup -DisplayName 'My Company Managers' `
 -Name MCManagers `
 -PrimarySmtpAddress 'mcmanagers@mycompany.com' `
 -RecipientFilter { DirectReports -ne $null}

*Standard disclaimer about minding the backtick line continuations. This is actually a one-liner, but since the important part is at the end, this format is easier to read.

The result is a list of anyone in your organization that has someone reporting to them. Exactly what the HR folks want when they ask for a mailing list of all the managers!
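Before handing the list to HR, it’s worth sanity-checking who the filter actually catches. Exchange can preview a dynamic group’s membership – a sketch, using the MCManagers group created above (run from an Exchange shell):

```powershell
# Retrieve the dynamic group and preview which recipients match its filter
$group = Get-DynamicDistributionGroup -Identity MCManagers
Get-Recipient -RecipientPreviewFilter $group.RecipientFilter |
    Select-Object Name, Title
```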

How a cloud service pays for itself in one emergency

Imagine this scenario: it’s Friday evening, and a storm that was going to hit a nearby town veers off toward your corporate HQ. Suddenly you’re looking at torrential rains and flooding of biblical proportions. Oh, and to top it off, HQ is the major stocking location for your national distribution business.

We weren’t imagining this when Hurricane Harvey dropped trillions of gallons of rain on the Houston area. Of all the Disaster Recovery and Business Continuity plans, the one that turned out to be very, very simple was email and communication. We had already transitioned all of that to Office 365. What was our DR/BC plan for email and other written communication? We didn’t need to worry about the service. We simply made sure that key people had laptops, and we were good to go!

Of course, all wasn’t peaches and roses.

We have a couple of areas in the business that were still using on-premises file servers. The users trying to access those servers had to use a VPN (an uncommon occurrence for many of them). That was awkward and strained our licensed capacity for VPN connections, but it wasn’t a major ordeal. Guess who is going to be pushed toward SharePoint Online Document Sites? You betcha!

The other pain point was our legacy IBM Power Systems server. None of our locations lost power or connectivity during the storm, but we watched it like a hawk to determine whether we should fail over to the DR site. If that system were hosted in a cloud data center in Arizona, there would be far less chance of hurricane issues.

True enough, cloud-based systems have their own set of problems. However, if you go with a geo-diverse plan, the likelihood of a big storm, blizzard, or earthquake halting your business is greatly reduced. That’s in addition to all the typically touted benefits: OPEX vs. CAPEX, less hardware maintenance, and all that.

The moral of the story: If you work in a hurricane zone – pay attention to the cloud.

Update Status

Once again life has bitten me in my best intentions.

I promised myself and both my readers that this blog was going to be a regular thing, and in the beginning I did OK. Then life happened, as usual. Between work projects and an increasing feeling of pressure to finish my MS certifications, blogging fell by the wayside. This was somewhat exacerbated by my agonizing over every word going on the screen.

“What if I put out something wrong?” “What if I look dumb?”

I had an interesting conversation with a gentleman I admire and respect, who told me to stop agonizing and DO IT. If there is a mistake, own it, fix it, and move on. Spell check, grammar check, POST, he told me.

Later on at TechMentor Don Jones and Jason Helmick talked about how people are slowed by a fear of failure. The “what if..” consumes them into paralysis. To grow, to move forward, we have to get over that. Conquer it.

That’s what blogging was like for me. I was foolishly trying to make every entry a Pulitzer Prize sort of thing. Yeah, ridiculous, I know – especially when I have more fingers than readers.

So this is me “getting over it”.  I’ll post much more often. Things may not be perfect, but they’ll be honest and as accurate as I can be.

Here’s to the ride.

And now for something completely different………..

So I had the cool project going for the email account scripting and all of that mentioned in the previous post.

But then the inevitable happened. Pulled from that project to work on a more important project, pulled from that second project to put out a fire. Blah blah blah – the story of an IT Pro in a smaller company. It’s both my greatest joy and my largest pain, so rather than leave a long pause between posts, I thought I’d dash out a little blog-blurb about what’s going on.

The major project that pulled my attention away from the Exchange reporting project is actually related. While I was working on that and sharing some preliminary results with senior management, they said, “Bah, those people don’t need email – just delete them.”

Of course, that throws a monkey wrench into the employee onboarding script I had written some time ago. See, at that point the policy was ‘everyone gets a mailbox’, and I was a bit of a larval scripter, certainly not yet a toolmaker. The result was a one-size-fits-all script with a lot of assumptions and not much reusability. So with the change in policy, I took one look at the old script and said OH HELL NO! Time to refactor.

So what was a 200-line script that did several intertwined things is in the process of becoming a full-fledged module with six advanced functions, plus a ‘controller’ script to tie them all together.

The bad news is that put my Exchange/Archive joining project on hold.

The good news is that this refactored User Profile Tools module will save us a lot of time, confusion, and headache. Bonus round: I’ll have another subject for this blog. :)

Until Next Time…………………

Exchange and Archive PST Blues

This may draw the ire of some big-environment, siloed Exchange admins, but our environment gets very little “administration”. Oh, there are patches applied regularly and users added and removed, but that’s about it. I keep an eye on the event logs for errors, watch disk space to make sure we’re not running out, and make sure the DAGs are replicating. Done – end of list. There are simply too many other irons in the fire to deal with small “wouldn’t it be nice” things.

One of those things on the One Day List is the .PST files resulting from Outlook AutoArchive. Back in the old days, we had to use archiving to keep file sizes down for performance reasons. These days, though, newer versions of Outlook let you set the sync window on your cached mailbox, which keeps the amount of data loaded into Outlook at a manageable level. This, plus the relative inexpensiveness of storage, has made the archive unnecessary – in our environment, at least.

Just because the archive process isn’t needed doesn’t mean we can just seek and destroy all of the .PST files scattered around. No, we need to find them and import them back into the users’ mailboxes on Exchange. This is a good news/bad news situation. The good news is that we back up all of our PCs centrally using a file-copy solution, so I have copies of all the PST files on a backup drive – one nice, easy place to find them. The bad news? We’re working toward a migration to Office 365, so size does matter. How many of these mailboxes are going to be over 50 GB? Any? Some? All I could do was guess, and I hate to guess. I’m paid to know (or at least find out)!

This looks like a job for POWERSHELL!

Allow me to finish setting the technical scene with a few little tidbits. I just started running Windows 10 Insider Preview on my production workstation, and while I’ve messed about with PowerShell 5 in a lab or at home, a domain-joined production PC feels different. Secondly, we’re still on Exchange 2010, with all that entails. I long ago added implicit remoting sessions to a DC and one of my Exchange servers to my PowerShell profile, so I can Get-Mailbox and Set-ADUser without having to invoke commands remotely or remember to import anything. All set. Good! Right? Not so fast.
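For context, that profile trick is just classic implicit remoting – something like this sketch (the connection URI here is made up for illustration):

```powershell
# Sketch: import the Exchange 2010 cmdlets into the local session via
# implicit remoting (the server URI is hypothetical)
$exchSession = New-PSSession -ConfigurationName Microsoft.Exchange `
    -ConnectionUri 'http://exch01.mycompany.com/PowerShell/'
Import-PSSession $exchSession
```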

I figure I’ll break this project into three pieces.

  1. I’ll write a script to pull the sizes of all the company mailboxes. Since I have users that seem to forget how to empty the Deleted Items folder, I want to break out the result into Inbox, Sent Items and Deleted Items as well as the total size of each mailbox. I’ll output this to a CSV file called MailboxSizes.csv
  2. I’ll write another script to scan our backup files for .PST files. I’ll make note of the path and the size and output this to a file called Archives.csv
  3. Then I’ll write a third script to import these PST files into their owners’ mailboxes.
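A rough sketch of what step 1 might look like, run through that implicit remoting session (just the total size here – the per-folder breakout would use Get-MailboxFolderStatistics, and the size values still need the massaging described later in this post):

```powershell
# Rough sketch of step 1: total size per mailbox, out to CSV
Get-Mailbox -ResultSize Unlimited | ForEach-Object {
    $stats = Get-MailboxStatistics -Identity $_.Identity
    [pscustomobject]@{
        Name          = $_.DisplayName
        TotalItemSize = $stats.TotalItemSize
    }
} | Export-Csv -Path .\MailboxSizes.csv -NoTypeInformation
```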

Sounds simple, right? Well, in concept it was simple. Execution, however, was a wee bit trickier. I had forgotten one of the gotchas of PowerShell remoting: the objects returned from Exchange in a remote session are not the same objects you get running locally. For example:

PS C:\> (Get-mailboxstatistics -Identity Greg).TotalItemSize.Value.ToMB()

This should return the total size of my mailbox as a number of megabytes. Not remotely, it doesn’t. Remotely, the return looks like this:

PS C:\> (Get-mailboxstatistics -Identity Greg).TotalItemSize.Value.ToMB()

You cannot call a method on a null-valued expression.

……What the heck? So, donning my sleuthing hat, I worked backwards. I removed the “.ToMB()” to see if Value held anything that perhaps didn’t like the method. Interestingly, I got nothing. There’s your null-valued expression right there – but why is it null? Let’s back off one more step to see where the problem is hiding.

PS C:\> (Get-mailboxstatistics -Identity Greg).TotalItemSize

642.5 MB (673,721,009 bytes)

….. OK, some data. But why is the Value property empty?

Hmm. With a head scratch, I piped it to Get-Member, and light dawned: lots of methods, but only one property, and that’s Length.

The remote TotalItemSize doesn’t have a Value property at all – it has been deserialized into a string, with just a Length. And it’s hard to do math with a string value.

That’s when I remembered: Exchange 2010 was an early adopter of PowerShell, but it was PowerShell v2, and remoting wasn’t as mature. So I tried a few other things, like Invoke-Command, thinking that since the command was running on the Exchange server it might work. Alas, it wasn’t to be. I wasn’t going to RDP onto the server, write and run a v2 script, and then copy the file back to my desk – that would defeat the ‘automation’ part of this exercise. We have people coming and going every week, so this reporting was going to be done several times! After noodling around for an hour or more trying to work around this deserialization problem, I decided to take the lemon and make lemonade.

Time to start working with the string I was given. I knew the best way to do this was with regex (regular expressions), but honestly, I had no idea how to start building an expression. Time to Bing-Google it. Bingo! An old Hey, Scripting Guy! blog post helped out. While the post was aimed at Exchange Online, the regex it demonstrated worked great for me, so I “re-purposed” that snippet. I also added “Learn Regular Expressions” to my <To Learn List>.

With a little polishing I now had a working basis for my scripting project.

PS C:\> [Math]::Round(((Get-MailboxStatistics -Identity greg).TotalItemSize -replace '(.*\()|,| [a-z]*\)','')/1MB,2)

642.5
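To keep from re-typing that regex in every script, it can be wrapped in a small helper – the function name here is my own invention:

```powershell
# Convert an Exchange size string such as '642.5 MB (673,721,009 bytes)'
# into megabytes, using the exact byte count inside the parentheses
function ConvertTo-ItemSizeMB {
    param([string]$SizeString)
    [Math]::Round(($SizeString -replace '(.*\()|,| [a-z]*\)', '') / 1MB, 2)
}

ConvertTo-ItemSizeMB '642.5 MB (673,721,009 bytes)'
```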

In my next post it’s time to put some toolmaking magic on this puppy!

Once more with feeling…..

OK, so I have tried the whole blogging thing before.
Truth be told, my heart wasn’t in it then. I lacked the confidence that I had anything to contribute to the greater voice of the Internet.
Then I realized that it really doesn’t matter. It’s not like I’m taking up space that someone else could use. Storage is comparatively cheap, and the only way to improve is to practice. So here we go!