Category Archives: PowerShell

Script for assigning SharePoint Licenses to Office365

Adding SharePoint licenses to Office 365 can be a bit tricky.

If you add the E3 license, you get EVERYTHING that comes with E3. If that's what you need, great – but what if you ONLY want SharePoint, and not Lync, email, etc.?

I ran into this recently and used a few resources to come up with a script.

This article was really helpful:  http://www.powershellmagazine.com/2012/04/23/provisioning-and-licensing-office-365-accounts-with-powershell/

As was some script work by an awesome guy I work with named Chris.

The tricky thing here is you can’t directly grant just a SharePoint license in MSOL E3…

You have to do it subtractively.

Let me explain…..

Say you have 3 letters, A, B & C

You might expect to add a license for B like this:

Add-license -option B

It doesn’t work that way. (At least not in 2015 when I wrote this)

Instead you have to say:

Add-License -disable A C

No problem you say.

“I’ll just add code to disable A C”

That’s great, until….

Microsoft adds Option D

Now, when you try

Add-License -disable A C

You’ve just assigned a B and D license, when you only wanted to assign a B license.

Now you see the issue….

The solution is not too hard – we can pull a list of all options available, remove the one we want, and then build the disable list from that.

This way we won’t get caught when Microsoft springs options EFGHI on us.
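
Stripped down to its core, the subtractive approach looks something like this. It's just a sketch – the tenant, SKU, and user names are placeholders, and the full script below wraps the same idea in reusable functions:

# Core idea only: ask the tenant for every plan in the SKU, then disable all but the one we want.
$sku  = "TENANTNAME:ENTERPRISEPACK"   # placeholder E3 SKU
$keep = "SHAREPOINTENTERPRISE"        # the only service plan we want enabled

$disable = get-msolaccountsku | where {$_.AccountSkuId -eq $sku} |
    select -expand ServiceStatus | select -expand ServicePlan |
    select -expand ServiceName | where {$_ -ne $keep}

$options = New-MsolLicenseOptions -AccountSkuId $sku -DisabledPlans $disable
Set-MsolUserLicense -UserPrincipalName "someone@yourdomain.com" -AddLicenses $sku -LicenseOptions $options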

The full script is below.

Note: there are some unused functions in the script for setting a password. If you have users who are brand new to Office 365, they may never have used their identity and will need the password reset and sent to them. If that's the case, just add a call to Reset-Password -upn $upn at the appropriate place(s).

Here’s the script:

#http://sharepointjack.com
#based on content from http://www.powershellmagazine.com/2012/04/23/provisioning-and-licensing-office-365-accounts-with-powershell/
#use at your own risk, while this has worked for me, you should always test in a test environment and understand what it is the script is doing before using it.



write-host "Don't forget to connect to MSOL first!" -foregroundcolor yellow
Start-transcript 
function main()
{
    #variables used in the program
    $script:MissingUsers = @()
	$script:SMTPServer = "YOUR.SMTP.ADDRESS.COM"
	$importFilename = "_SPOUserList.txt"
	$LicenseType = "TENANTNAME:ENTERPRISEPACK"
	$SubLicense = "SHAREPOINTENTERPRISE"

	$path = $(get-location).path
	$userlist = get-content -path "$path\$importfilename"

	foreach ($upn in $userlist)
	{
	    	$upn = $upn.trim()
		
		#note the continue statement on next line, this skips the rest of the loop if the user is not found.
		if ((Check-UserExists -upn $upn) -ne $true) {write-host "skipping $upn" -foregroundcolor black -backgroundcolor yellow; continue}
	    if ((CheckUserHasPrimaryLicense -upn $upn -PrimaryLicense $LicenseType) -eq $true)
		{
			#user has E3 license
			Write-host "User $upn has $LicenseType License, adding $SubLicense SubLicense"
			Add-SubLicense -upn $upn -PrimaryLicense $LicenseType -SubLicense $SubLicense
		} else {
		    #user has no license of any kind, but is still provisioned in MSOL
			write-host "User $upn does not have a license for $LicenseType adding now"
			Assign-NewLicense -upn $upn -Location "US" -PrimaryLicense $LicenseType -SubLicense $SubLicense
			
		}	
  #note, if you need to reset the users password and email that to them, add a line such as:
  # Reset-Password -upn $upn
	}	
    
	Report-MissingUsers   #report the names of any missing users so they can be investigated	
}#end main

##---------------------------------------------------------------------
##  Utility Functions from here down
##---------------------------------------------------------------------


##---------------------------------------------------------------------
## this function checks the upn (email address) to see if they exist at all in MSOL
## typically if they don't, there is a misspelling or other problem with the name and we want to report on that...
##---------------------------------------------------------------------
function Check-UserExists ($upn)
{
   $spouser = get-msoluser -user $upn -erroraction silentlycontinue
   if ($spouser -eq $null)
   {
       Write-host "user >$upn< Not found in MSOL" -foregroundcolor DarkRed -backgroundcolor Gray
	   $Script:Missingusers += $upn
       return $false 
   }
   else
   {
       return $true
   }
}

##---------------------------------------------------------------------
## this function checks the upn (email address) to see if it has the passed primary License (For example TENANTNAME:ENTERPRISEPACK)
## it returns true if the user has this license, and false if they do not.
##---------------------------------------------------------------------
function CheckUserHasPrimaryLicense ($upn, $PrimaryLicense) 
{
	$ReturnValue = $false
	$spouser = get-msoluser -user $upn
	$count = $($spouser.Licenses | where {$_.AccountSkuId -EQ $PrimaryLicense}).count
	write-host "Found exactly $count Licenses that matched $PrimaryLicense for user $upn" -foregroundcolor yellow
	if ($count -eq 1) 
	{ $ReturnValue = $true }
	return $ReturnValue
}

##---------------------------------------------------------------------
## this function Adds a given SubLicense (for example SHAREPOINTENTERPRISE) 
##  to a users Pre-Existing License (for example TENANTNAME:ENTERPRISEPACK)
##---------------------------------------------------------------------
function Add-SubLicense($upn, $PrimaryLicense, $SubLicense)
{
	$spouser = get-msoluser -user $upn
	#assemble a list of sub-licenses types the user has that are currently disabled, minus the one we're trying to add 
	$disabledServices = $spouser.Licenses.servicestatus | where {$_.ProvisioningStatus -eq "Disabled"}  | select -expand serviceplan | Select ServiceName | where {$_.ServiceName -ne $SubLicense}
	
	#disabled items need to be in an array form, next 2 lines build that...
	$disabled = @()
	foreach  ($item in $disabledServices.servicename) {$disabled += $item}
	
	write-host "  Adding Sub-license $SubLicense to existing $PrimaryLicense License to user $upn" -foregroundcolor green
	write-host "    Disabled License options: '$Disabled'" -foregroundColor green
	
	$LicenseOptions = New-MsolLicenseOptions -AccountSkuId $PrimaryLicense -DisabledPlans $disabled
	set-msoluserlicense  -userprincipalname $upn -licenseoptions  $LicenseOptions
}

##---------------------------------------------------------------------
## this function Assigns a new Primary License ($PrimaryLicense) and SubLicense to a users MSOL account
##---------------------------------------------------------------------
Function Assign-NewLicense($upn, $Location, $PrimaryLicense, $SubLicense)
{
    #assemble a list of sub-licenses available in the tenant, we want to disable all but our target sublicense
	$disabledServices = get-msolaccountsku | Where {$_.accountSkuID -eq $PrimaryLicense} | Select -expand "ServiceStatus" | select -expand "ServicePlan" | select ServiceName | where {$_.ServiceName -ne $SubLicense}
	
	#disabled items need to be in an array form, next 2 lines build that...
	$disabled = @()
	foreach  ($item in $disabledServices.servicename) {$disabled += $item}
	
	write-host "  Adding Completely new $PrimaryLicense license with $SubLicense sublicense for user $upn " -foregroundColor cyan
	write-host "    Disabled License options: $Disabled" -foregroundColor cyan
	
	$LicenseOptions = New-MsolLicenseOptions -AccountSkuId $PrimaryLicense -DisabledPlans $Disabled
	Set-MsolUser -UserPrincipalName $upn -UsageLocation $Location 
	Set-MsolUserLicense -User $Upn -AddLicenses $PrimaryLicense -LicenseOptions $LicenseOptions

}


##---------------------------------------------------------------------
## This function changes the MSOL users password and
## emails the user the temp password and some basic instructions
##---------------------------------------------------------------------
Function Reset-Password($upn)
{
	#generates a random password, 
	#Changes the MSOL Password,
	#emails the user the temp password and some basic instructions
	
	$tempPassword = Generate-Password
	Set-msolUserPassword -UserPrincipalName $upn -NewPassword $tempPassword
	$to = $upn
	$cc = "adminemail@yourdomain.com"
	$from = "adminemail@yourdomain.com"
	$Subject = "Important: Temporary password for your SharePoint Online Account"
	$body =  "Hello, <br/><br/>    You've just been granted a license for SharePoint Online.<br/><br/>"
	$body += "Your user ID is  <b>$upn</b> and your Temporary Password is <b>$TempPassword</b><br/>"
	
	$body += "Please log on to <a href='http://portal.office.com'>http://portal.office.com</a> <b>right now</b> and change the temporary password above to one you'll remember.<br/><br/>"
	
	write-host "Sending email"
	Send-MailMessage -From $from -to $to -cc $cc -Subject $subject -Body $body -BodyAsHtml -SmtpServer $script:SMTPServer	
	Write-host "Email sent to $upn with $tempPassword"
}

##---------------------------------------------------------------------
## This function generates a random password
## 15 chars long with a guaranteed minimum of 1 number, 1 lower and 1 upper case 
##---------------------------------------------------------------------
Function generate-Password
{
	$alphabetUpper = $NULL;
	for ($a=65; $a -le 90; $a++)
	{
		$alphabetUpper+=,[char][byte]$a 
	}

	$alphabetlower = $NULL;
	for ($a=97; $a -le 122; $a++)
	{
		$alphabetlower+=,[char][byte]$a 
	}

	$ascii=$NULL;
	For ($a=48;$a -le 122;$a++) 
	{
		$ascii+=,[char][byte]$a
	}

	$Fullset=$null
	For ($a=48;$a -le 57;$a++)  #0-9
	{
		$Fullset+=,[char][byte]$a
	}
	For ($a=65;$a -le 90;$a++) #A-Z
	{
		$Fullset+=,[char][byte]$a
	}
	For ($a=97;$a -le 122;$a++) #a-z
	{
		$Fullset+=,[char][byte]$a
	}

	$onepassword = ""   #start with an empty string so += always concatenates characters
     #start password with an alphabetical letter.
	 $onepassword += (get-random -InputObject $alphabetlower)
	 $onepassword += (get-random -InputObject $alphabetUpper)
	 #now add a number to guarantee we have a number (-Maximum is exclusive, so 10 allows 0-9)
	 $onepassword += get-random -Minimum 0 -Maximum 10
	 
	 
     #now add 12 more random chars from the combined set, for 15 characters total.
	 for ($pwlen=0; $pwlen -le 11; $pwlen++)
	 {
       $onepassword += (get-random -inputObject $Fullset)
     }
	 return $onePassword
}

##---------------------------------------------------------------------
## This function displays/emails any missing user infomation
##---------------------------------------------------------------------	 
function Report-MissingUsers()
{
  if ($script:MissingUsers.length -gt 0)
  {
    write-host " -------------------------------------- "
	Write-host "          The following users were not found in SPO:          " -foregroundcolor magenta -backgroundcolor gray
	$script:Missingusers
	Write-host "          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^          " -foregroundcolor magenta -backgroundcolor gray
	
	$to = "adminemail@yourdomain.com"
	$from = "adminemail@yourdomain.com"
	$Subject = "Missing users in last license assignment"
	$body = $script:MissingUsers -join "<br/>"
	
	write-host "Sending email of missing users to $to"
	Send-MailMessage -From $from -to $to -Subject $subject -Body $body -BodyAsHtml -SmtpServer $script:SMTPServer	
	Write-host "Missing user email sent to $to"
  }
}

main #call main procedure, this line must be the very last line in the file.
Stop-transcript #ok, actually this line must be the very last line...

 

Create a ShareGate User mapping file between on Premise AD and o365 / Azure AD

We use ShareGate to migrate content.

We recently started using ShareGate to migrate content from On Premise to SharePoint Online.

When I did this, I found that one of our domain’s users kept showing up as errors in ShareGate – it said it could not find the user in SharePoint Online.

ShareGate has a nice feature for mapping users from one system to users in another – but doing this manually to any scale would be pretty time consuming.

Thankfully, ShareGate lets us save the mappings, which are just XML files with a .sgum file extension.

Wouldn’t it be great if there was a way to automate creating a mapping file like this for everyone in the domain at once?

Have a look at the script below: it pulls all the user accounts from an OU in AD, looks each one up in MSOL (Office 365 / Azure AD), then grabs the O365 display name and makes the mapping. Any user not found is logged so it can be dealt with separately.

The whole thing is written out as a complete .sgum file, ready to import into ShareGate the next time you migrate!
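
For reference, here's roughly what the finished file ends up looking like once the script has run. The names are made up, and the real file also contains a couple of XML comments the script writes along the way:

<?xml version="1.0"?>
<UserAndGroupMappings xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
	<Mappings>
		<Mapping>
			<Source AccountName="DOMAINGOESHERE\jsmith" DisplayName="Jane Smith" />
			<Destination AccountName="i:0#f|membership|jsmith@yourdomain.com" DisplayName="Jane Smith" />
		</Mapping>
	</Mappings>
</UserAndGroupMappings>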

Note I didn’t figure out the XML stuff in a vacuum – I found an article on powershellmagazine.com to be very helpful and noted it in the script.

# sharepointjack.com
# use at your own risk

$users = get-aduser -server server.domain.com -filter * -searchbase "OU=Users,DC=server,DC=domain,DC=COM"

$total = $users.count
$count = 0
$badnames = @()


#--------------------------------------
# from http://www.powershellmagazine.com/2013/08/19/mastering-everyday-xml-tasks-in-powershell/
$dt = get-date -format "yyyyMMdd"
$path = "$(get-location)\UserMap_$dt.sgum"
$XmlWriter = new-object System.XML.XMLTextWriter($path, $null)
$XmlWriter.Formatting = 'Indented'
$xmlwriter.Indentation = 1
$XmlWriter.IndentChar = "`t"


#write the header
$xmlWriter.WriteStartDocument()

$XmlWriter.WriteComment("Start of XML")
$XMLWriter.WriteStartElement('UserAndGroupMappings')
$XmlWriter.WriteAttributeString('xmlns:xsd', 'http://www.w3.org/2001/XMLSchema')
$XmlWriter.WriteAttributeString('xmlns:xsi', 'http://www.w3.org/2001/XMLSchema-instance')


$XMLWriter.WriteStartElement("Mappings")
$XmlWriter.WriteComment("Start of main loop")
foreach ($OneUser in $users)
{ 
    $count ++
    $XMLWriter.WriteStartElement("Mapping")
    $SourceAccountName = "DOMAINGOESHERE\$($OneUser.Samaccountname)"
    $SourceDisplayname = $OneUser.name

    $DestinationAccountName = "i:0#f|membership|$($OneUser.UserPrincipalName)"
    #pull the destination user name from MSOnLine
    $DDN = $(get-MSOLuser -userprincipalName $OneUser.UserPrincipalName -erroraction silentlycontinue).Displayname
    #if MSOL not found, length will be zero. in that case use the AD displayname
    if ($DDN.length -eq 0)
    {
       $DestinationDisplayname = $OneUser.name
       $badnames += $OneUser.userprincipalName
       write-host "Warning: $($OneUser.userprincipalName) username Not found in MSOL" -foregroundcolor cyan
    }
    else
    {
       $DestinationDisplayname = $DDN 
       write-host "$count of $total"
    }
 
    $XMLWriter.WriteStartElement("Source")
    $XmlWriter.WriteAttributeString('AccountName', $SourceAccountName)
    $XmlWriter.WriteAttributeString('DisplayName', $SourceDisplayname)
    $XmlWriter.WriteEndElement() #source

    $XMLWriter.WriteStartElement("Destination")
    $XmlWriter.WriteAttributeString('AccountName', $DestinationAccountName)
    $XmlWriter.WriteAttributeString('DisplayName', $DestinationDisplayname)
    $XmlWriter.WriteEndElement() #Destination
    $XmlWriter.WriteEndElement() #mapping
}
$XmlWriter.WriteEndElement() #mappings
$XmlWriter.WriteEndElement() #UserAndGroupMappings


#finalize the document
$xmlWriter.WriteEndDocument()
$xmlWriter.Flush()
$xmlWriter.Close()

$bnpath = "$(get-location)\BadNames_$dt.txt"
$badnames | out-file -filepath $bnpath
notepad $path

– Jack

Add a person as a site collection administrator to every Office 365 Site / SharePoint Online Site Collection

The Problem:

In SharePoint online (at least as of early 2015) site collection administrators have to be granted on a site by site basis.

When you create a new site collection using https://yoururl-admin.sharepoint.com, you are only allowed to pick ONE administrator for the site collection (on-premises, you used to be able to pick two).

[Screenshot: the new site collection dialog in the SPO admin center]

Now a little trick you can use is, after the site collection is created, you can check the site collection then click the “owners” tab:

[Screenshot: the SPO admin bar with the "owners" tab]

and from that screen you can add as many site collection administrators as you’d like:

[Screenshot: the dialog for adding site collection administrators]

 

But there is a downside: you can't "select all" on your site collections and add a user to all of them at once.

Now, I hear you saying “Jack: What if I have 500 site collections and we add a new member to our team?” There’s got to be a better way, right? And it turns out, there is.

The Solution: PowerShell…

A Quick note before we get to the script: You’ll need the SharePoint Online Management Shell installed on your PC before this will work.
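
A quick way to check for it before you start editing variables (this assumes the standard module name for the management shell):

# Check for the SharePoint Online Management Shell module and load it if present.
if (Get-Module -ListAvailable -Name Microsoft.Online.SharePoint.PowerShell) {
    Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking
} else {
    write-host "SharePoint Online Management Shell not found - install it before running the script" -foregroundcolor red
}
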
Here’s a quick overview of how to use the script:

Update all the relevant variables:

  1. The admin site URL ($Adminurl), and the $username that has permission to log into that admin site to make the change.
  2. Your $TenantURL.
  3. The list of $SiteCollectionAdmins – the users you want to make site collection admins.

Run the script.

When you run the script it will try to log on to your SPO account and will prompt you for your SPO password; then you should see some slow and steady progress as it runs through each site collection. Finally, at the end you can review the log file to see if there were any issues.

The Script:

# Jack Fruh - sharepointjack.com
# add a user or users to the site collection admin role on every site collection in Office 365 sites (SharePoint Online)

#setup a log path
$path = "$($(get-location).path)\LogFile.txt"
#note we're using start-transcript, this does not work from inside the powershell ISE, only the command prompt

start-transcript -path $Path
write-host "This will connect to SharePoint Online"

#Admin Variables:
$Adminurl = "https://yoururl-admin.sharepoint.com"
$username = "your@email.com"

#Tenant Variables:
$TenantURL = "https://yoururl.sharepoint.com"

$SiteCollectionAdmins = @("firstuser@yourdomain.com", "seconduser@yourdomain.com", "etc@yourdomain.com")

#Connect to SPO
$SecurePWD = read-host -assecurestring "Enter Password for $username"
$credential = new-object -typename System.Management.Automation.PSCredential -argumentlist $username, $SecurePWD

Connect-SPOService -url $Adminurl -credential $credential
write-host "Connected" -foregroundcolor green


$sites = get-sposite
Foreach ($site in $sites)
{
    Write-host "Adding users to $($site.URL)" -foregroundcolor yellow
	#get the owner group name
	$ownerGroup = get-spoSitegroup -site $site.url | where {$_.title -like "*Owners"}
	$ownertitle = $ownerGroup.title
	Write-host "Owner Group is named > $ownertitle > " -foregroundcolor cyan
	
	#add the Site Collection Admin to the site in the owners group
	foreach ($user in $SiteCollectionAdmins)
	{
		Write-host "Adding $user to $($site.URL) as a user..."
		add-SPOuser  -site $site.url -LoginName $user -group $ownerTitle
		write-host "Done"
		
		#Set the site collection admin flag for the Site collection admin
		write-host "Setting up $user as a site collection admin on $($site.url)..."
		set-spouser -site $site.url -loginname $user -IsSiteCollectionAdmin $true
		write-host "Done"	-foregroundcolor green
	}
}
Write-host "Done with everything" -foregroundcolor green 
stop-transcript

 

Sync an Active Directory Group with a SharePoint Group

Have you ever wanted to keep the members of a SharePoint group in sync with those of an Active Directory Group?

If so, you're in luck – I happen to have just such a script.

Just a quick note: this was written for and tested on a 2010 site. In 2013, the default authentication is Claims – those funny looking strings like "i:0#.w|domain\user" – and you'll need to work this script over a few times to make it work there.

That said, this script will keep a SharePoint group in sync with an AD group.

The AD group is considered the “master”

That is to say, if the AD group has extra users that aren’t in SharePoint, they will be added to SharePoint.

If a user is removed from the AD group, they will also be removed from the SharePoint Group.

If a user is added to the SP Group, but isn’t in the AD group? They will be removed from the SP group.
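
Under the hood, the script leans on Compare-Object to work out those three cases. Here's a tiny illustration (made-up names) of how the SideIndicator output maps to the add/remove decisions the script filters on further down:

# Illustration only - not part of the sync script.
$adUsers = @("DOMAIN\ALICE", "DOMAIN\BOB")       # the AD group (the "master")
$spUsers = @("DOMAIN\BOB", "DOMAIN\CHARLIE")     # the SharePoint group
compare-object -referenceobject $adUsers -differenceobject $spUsers -includeequal
# "<="  only in the AD (reference) list   -> add the user to SharePoint
# "=>"  only in the SP (difference) list  -> remove the user from SharePoint
# "=="  in both lists                     -> nothing to do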

This is one of those scripts that makes sense to run as a scheduled task once you get it working.  If you need help with that, see my post: Schedule your PowerShell scripts using the Windows task scheduler.

#AD to SharePoint Sync Script
#Jack Fruh 2014 Sharepointjack.com
write-host "NOTE: This has only been tested on NTLM authentication - it will likely need work to work with claims based authentication in 2013" -foregroundcolor red

function main()
{   
    #setup the environment so this can run as a schedule task:
    #note that the AD PowerShell Commands must be installed on the SharePoint Server for this to work.
    Write-host "note that the AD PowerShell Commands must be installed on the SharePoint Server for this to work." -foregroundcolor blue
    Write-host "Adding SharePoint Snapin"
    $Host.Runspace.ThreadOptions = "ReuseThread"
    Add-PSSnapin microsoft.sharepoint.powershell -ErrorAction SilentlyContinue
    Write-host "Importing AD module"
    import-module activedirectory

    write-host "Starting work..."
    #Hard coded variables for testing, 
    # we need the name of the AD Group, the name of the corresponding group in Sharepoint to sync with, and the URL of the SPWeb where the SP group resides.
    $ADgroupname = "TheGroup"
    $SPGroupName = "SyncMeWithAD"
    $spweburl = "http://sharepoint2010"
    #note that it's reasonably easy to turn this hardcoded list into a CSV import and then loop through multiple groups

    #get a list of the AD Users in the AD Group
    #$ADGroupMembers = get-adgroupmember -Identity $ADgroupname | select @{name="LoginName";expression={$_.samaccountname}}
    $ADGroupMembers = get-adgroupmember -Identity $ADgroupname | select @{name="LoginName";expression={$(getDomain($_.distinguishedName)) + "\" +  $_.samaccountname.toupper()}}
    if ($ADGroupMembers -eq $null)
    {
        write-host "The AD Group we're syncing with is empty - this is usually a problem or typo - the SP group will be left alone" -foregroundcolor red
        exit
    }


    #get the list of users in the SharePoint Group
    $web = get-spweb $spweburl
    $group = $web.groups[$SPGroupName]
    if ($group -eq $null) {write-host "SPGroup Not found" ; exit }
    $spusers = $group.users | select @{name="LoginName";expression={$_.LoginName.toupper()}}
  
    write-host "Debug: at this point we should have a list of user ID's from SharePoint in domain\user format, uppercase" 
    foreach($x in $spusers)
    {
        write-host $x.LoginName -foregroundcolor green
    }
  
    if($spusers -eq $null)
    {
      write-host "The SPgroup is empty" -foregroundcolor cyan
      write-host "Adding all AD group members to the SP group"
      foreach ($ADGroupMember in $ADGroupMembers)
      {
            #add the AD group member to the SP group
            write-host "Adding $($ADGroupMember.LoginName)" 
            write-host "new-spuser -useralias $($ADGroupMember.LoginName) -web $($web.url) -group $SPGroupName" -foregroundcolor green
            new-spuser -useralias $ADGroupMember.LoginName -web $web.url -group $SPGroupName
           # $web.site.rooteweb.ensureUser($ADGroupMember.loginname)
            set-SPuser -identity $ADGroupMember.LoginName -web $web.url -group $spgroupname
            
      }
      write-host "Done adding users - script will now exit" -foregroundcolor magenta
      exit
    }

    #use compare-object to get a listing of what's different between AD and SP
    write-host "Comparing AD Group Users to SP group Users"
    $result = compare-object -referenceobject $adgroupmembers -differenceobject $spusers  -includeequal -property LoginName

    Write-host "Result of comparison is:"
    $result
    write-host "-------------------------"
    
  
    

    #filter the results of the comparison to show only the users that need to be added
    Write-host "Looking for users in AD that we need to add to SharePoint"
    $missingSPusers = $result |  Where {$_.SideIndicator -eq '<='} | select LoginName #users in AD that are missing from SharePoint
    if ($missingSPusers -ne $Null)
    {
        foreach ($missingSPuser in $missingSPusers)
        {
            write-host "Adding $($missingSPUser.LoginName) to sharepoint"
            write-host "new-spuser -useralias $($missingSPuser.LoginName) -web $($web.url) -group $SPGroupName" -foregroundcolor green
            new-spuser -useralias $missingSPuser.LoginName -web $web.url -group $SPGroupName
            set-SPuser -identity $missingSPuser.LoginName -web $web.url -group $spgroupname
        }
    }
    
    #now do the reverse
    #Filter the results of the comparison to show 'extra' users that need to be removed
    write-host "Looking for Extra users in SharePoint that are not in AD"
    $extraSPusers = $result |  Where {$_.SideIndicator -eq '=>'} | select LoginName #users in SharePoint that are not in the AD group
    if ($extraSPusers -ne $null)
    {
        foreach ($extraSPuser in $extraSPusers)
        {
            write-host "Removing $($extraSPuser.LoginName) to sharepoint"
            Remove-SPUser -useralias $($extraSPuser.LoginName) -Web $web -group $SPGroupName -confirm:$false
        }
    }
} # end main function

function getDomain($distinguishedname)
{
  $dn = $distinguishedname
  #extract Domain name from Distinguished name
  #Domain name should be the first DC= entry from left to right.
  #Examples:  CN=BOB,OU=Users,DC=DomainShortName,DC=Com
  #Examples:  CN=BOB,OU=Users,DC=DomainShortName,DC=Forest,DC=Com
  $start = $dn.indexof(",DC=")+4 #find the first occurrence of ",DC=" in the DistinguishedName
  $end = $dn.indexof(",",$start)  #find the first comma, this is the end of the domain name
  $domain = $dn.substring($start, $end-$start).toupper()
  return $domain
}

main

Now for a friendly reminder and some advice…
#1) Always test code you find online before using it in production…
#2) When you test this code, follow this advice:

Testing this code

When you test the code, you might make a mistake I made during development – I’ll share that mistake with you to save you an hour of time and some frustration.

Here’s what I did…

While testing, I wanted to try adding users to an AD group and wanted to make sure they added in correctly.

For one test I wanted to remove ALL the users from the SharePoint Group, and confirm that they came back ok.

To do this I used the UI to remove all the users – I checked each user, then clicked “actions->remove users from group” like this:
[Screenshot: removing users from a group in SharePoint 2010]

I then ran my Super Awesome AD Sync PowerShell Script, which added the users back in.

Now here’s where it got ugly.

When I checked the UI, they weren’t there.

In fact, if I ran the powershell script again it indicated that they were being added back a second time (the script should have told me there was nothing to change!)

What was the cause?

It was my use of the refresh button… [Screenshot: the refresh button in IE]

Recall that the very last thing I did was remove users using that screen.

Now interestingly, you know how we all click “OK” on a screen without paying attention?

After I hit refresh, I got this, and ignored it:
[Screenshot: the dialog I should have read]

See what I did there?

I was refreshing the delete in the UI!

Don’t make that mistake!

Instead of clicking the refresh button, it’s easier (and safer) to click the group name on the left:
[Screenshot: click the group name on the left instead of refreshing after a delete]

Lessons learned:

  • Pay attention to dialog boxes, they may save you an hour.
  • Don’t ever click ‘refresh’ after performing a delete!

 

I’m presenting at SPFest Chicago – December 8th-10th 2014

I’m excited to be a part of SP Fest Chicago 2014 this year with two sessions:

“SharePoint PowerShell Time Machine”
On Tuesday December 9th at 11:20am.

In this session I share scripts that will either save you time, or in some cases, allow you to go back in time. For example, there’s a script that enables versioning on every document library in your farm. Another that logs basic permissions, and as many other handy scripts as I can fit into a 70 minute session!

“Advanced Introduction to PowerShell Scripting”
On Monday December 8th 8:30am-Noon with co-host Michael Blumenthal

This 100-200 level half day workshop will introduce you to PowerShell in a way that you can use the very next day at work. We'll quickly cover the basics of PowerShell, then demonstrate some real world scripts used to solve day to day problems in the life of a SharePoint Administrator. We then deconstruct a script – and show you "how we figured it out" – walking through the process of starting a script from scratch, and the explorative process used to figure out what to put in a script when you have no idea how at the beginning (and without using Google!). Along the way we explain and demonstrate the features of PowerShell that make things fun! There is something for everyone – new PowerShell users will be on the fast-track to using PS in day to day work, and more experienced PS users benefit from seeing real-world scripts and hearing how others go about solving problems. We even have a few tricks that will save time for developers!

I hope to see you there!

– Jack

SharePoint ULS logs Tips, Tricks, and Thoughts

ULS logs are where SharePoint logs what's happening; they are plain text files.

Here are a few thoughts on optimizing your workflow when dealing with the ULS logs.

#1 – Know where your logs are.

In Central Administration, under Monitoring->Diagnostic Logging, you can set the path for the ULS logs – this sets the path for EVERY server in your farm, so you'll need to ensure the drive letter exists and has space on EACH server. In general, you don't want these on the C: drive, which, unfortunately, is the default. I put them in a folder called d:\Sharepoint_Logs.
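
If you'd rather check or move the log location from PowerShell instead of clicking through Central Admin, here's a quick sketch – the path is just my example, and the folder has to exist on every server first:

Add-PSSnapin microsoft.sharepoint.powershell -ErrorAction SilentlyContinue

#where are the ULS logs right now?
(Get-SPDiagnosticConfig).LogLocation

#move them off the C: drive (create the folder on every server in the farm first)
Set-SPDiagnosticConfig -LogLocation "D:\Sharepoint_Logs"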

#2 – Don’t log more than you need to!

ULS log files can be HUGE – on the Diagnostic Logging screen (Central admin->Monitoring->Diagnostic Logging) consider doing the following:

  • Under “Event throttling” select “All Categories” then pull the drop downs in that section to “Reset to Default”
  • Under the "Trace Log" area, set the number of days to something reasonable – i.e. decide how far back you'd be willing to go to look at a problem; in many cases, 7 days should be sufficient.
  • Under the "Trace Log" area, Restrict Trace Log Disk Space – my log drive is 80GB, but it's also shared with a few other logs like the usage logs, and some CSVs I generate with PowerShell – so I limit ULS logs to 40GB. (A PowerShell equivalent of these trace log settings is sketched just below this list.)
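
Here's that PowerShell equivalent – a sketch using the values I mention above, so adjust them to your own environment:

#reset every category's throttling back to the defaults
Clear-SPLogLevel

#keep 7 days of trace logs and cap them at 40GB on disk
Set-SPDiagnosticConfig -DaysToKeepLogs 7 -LogMaxDiskSpaceUsageEnabled -LogDiskSpaceUsageGB 40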

#3 – Use the ULS Log Viewer.

There have been several of these types of viewers made, but there is “the one” that everyone always uses and that’s available here: http://www.toddklindt.com/ulsviewer

This used to be on Codeplex, but sadly, it wasn’t updated and Codeplex auto-deleted it. Todd was gracious enough to put a copy on his site so others can continue to use this great tool.

Personally, I keep a copy of the ULS Log Viewer right in the folder with the ULS logs- that way it’s nice and handy.

What the ULS Log Viewer does is parse the log file and display it in a very user friendly way – I used to use Notepad++ to look at the logs and this is much, much better for log viewing. It has the option of opening a log file, or of opening the ULS logs in real time. One nice thing about it is that it seems to "know" where your ULS logs are, so you can open the current one without having to go through a bunch of mouse clicks. It's also great at letting you filter the logs, for example, seeing only the critical events.

#4 – PowerShell related ULS commands

There are a few PowerShell commands for the ULS logs that can be handy to know about. This MSDN page talks about all of them; below you'll find a few that I've personally used, with a longer explanation than what's on MSDN.

Merge-SPLogFile

What this one does is combine all the ULS log file entries from all the servers in your farm and merge them down to a single file on the box where the command was run – very helpful for tracing issues between boxes on your farm.

There are a lot of ways to use the command, so try get-help Merge-SPLogfile -detailed to get the finer details (in particular, note the -StartTime and -EndTime parameters so you are only merging manageable amounts of data!)

Also a quick search found this great article by Cecildt with screenshots and lots more detail than I have here: http://cecildt.blogspot.com/2012/10/sharepoint-2010using-merge-splogfile-to.html
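
For example, here's the shape of a merge that only grabs a manageable slice of time – the output path is just an example:

#merge only the last 30 minutes of ULS entries from every server in the farm into one file
Merge-SPLogFile -Path "D:\Sharepoint_Logs\merged.log" -StartTime (get-date).AddMinutes(-30) -EndTime (get-date)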

New-SPLogFile

This command closes the current ULS log file and starts a new one. To my knowledge it only does this on the server where you've run the command (i.e. if you have 3 servers, you'll need to run it on each one). I've used this before when looking at a system in real time with a support person from MS – we ran this command to reset the log file, then recreated a problem, then ran the command again – and what we were left with was a very nice, small ULS log that was reasonably timeboxed around the event we were investigating. We could have done the same with Merge-SPLogFile by setting times, but that command is slower, and we would have needed to jot down the start and end times. In other words, New-SPLogFile doesn't do anything you couldn't do with other tools, it just makes it easier under the right circumstances.

– Jack

Active Directory Migration Woes (Part 2)

In Part 1, I talked about an SP2013 farm that wasn't behaving as a result of a lengthy in-progress AD migration, and linked to an article that talked about a hack, and then ultimately a solution via a SharePoint setting if your farm was patched to a patch level sometime around October 2012 or later. On the 2013 farm that resolved the issue and all were happy.

In Part 2, we visit the same issue on a SharePoint 2007 farm, and use the article's "hack" to resolve the issue in the 2007 environment.
For reference, the article whose approach was used is here:
http://blogs.technet.com/b/craigf/archive/2012/10/15/sharepoint-and-sid-history-not-playing-well-together.aspx

The 2007 farm, unlike the 2013 farm, doesn't have the option of using the SharePoint setting – it doesn't exist in the version of SP2007 in use, and a shotgun upgrade of the SP2007 farm is not an option.

Time to re-visit the “Hack” portion of the above article…

The Problem:
Two domains, Users in Domain A, SharePoint in Domain B
The users are going to be migrated one day, from A to B.
The AD team has replicated all the users so that they now exist in BOTH domain A and B.
Along with this replication, an attribute called “SID History” was set in domain B for each user.

Because of this, here’s how things work in Domain B…
If a program (any program, not just SharePoint) asks Domain B to fetch a user ID from a SID, Domain B will look in its list of users/SIDs.
In ordinary times, when users existed in only one place, the Domain B controller would try itself first, and not finding the ID, would then send off the request to Domain A.

In our case however, the ID’s exist in both places.

So an app requests the User ID from a given SID from the DC in Domain B. This time, Domain B finds that SID in SID history. “AH HA” the domain says to itself. “I don’t have this ID, but I can see right here, that ID was migrated to an ID I DO have. I’ll just give you the DomainB\Account” (Remember we want DomainA\Account)

While that would be very helpful if we were done with our migration, and really really helpful if the Domain A account was deleted and gone, in our case we had a little problem.

The users were still using accounts from Domain A, so we needed those to resolve.

In the article, Craig Forster figured out that by using the right tool to query AD, you could pre-populate the LSA cache with the “desired” user accounts from the right domain controller, eliminating the whole name lookup and SID history piece of the puzzle.

Craig’s article mentioned the approach he used, but not the scripts.

That’s where this blog post comes in…
I crafted a solution to do what Craig’s article described.

First things first, the prerequisites:

  • PSGetSid.exe you can get this from Sysinternals
  • You'll need PowerShell on the WFEs for your 2007 farm (regular PowerShell – we're not touching SharePoint, so there's no need to panic that SP2007 doesn't support PowerShell)
  • You’ll need the AD PowerShell module – this is a windows server
    “feature”, found under “remote server tools” when you add a feature.

Ok now for an overview of the solution:
From Craig’s article, we know we need to query not just the domain, but we need to look for every specific user.
To make this practical, we want to query AD for the list of users, then use the PSGetSid.exe program to query for that user.

start-transcript D:\adhack\RunLog.txt
import-module ActiveDirectory

$PSGETSID = "D:\ADHACK\PSgetSid.exe"
$batchFileName = "D:\ADHACK\FixMyAD.bat"

# this is to start the file clean so it's not appending between executions.
"Rem Start of file" | out-file -filepath $BatchFilename -Encoding "ASCII"

#Note, there is a good chance the next two registry commands will fail, for example, if they already exist - these are easy enough to set manually so that's what you should do, but leave them here for reference

#set the timeout - this should be a number greater than the time it takes to run this script, for example, if the script takes 15 min to run and you schedule it to run every 20 min, then you'd want this to be something like 40 minutes so if it fails once, you'd still have the values in cache.
new-itemproperty -path HKLM:\SYSTEM\CurrentControlSet\Control\LSA\ -name LsaLookupCacheRefreshTime -PropertyType dword -value 40 #time in Minutes

#set the LSA cache (Default of 128 is too small)
new-itemproperty -path HKLM:\SYSTEM\CurrentControlSet\Control\LSA\ -name LsaLookupCacheMaxSize -PropertyType dword -value 131072 #this is the decimal value

#remember, you're running this from a WFE that's joined to the DESTINATION Domain

#first OU Example - query an OU in DESTDOMAIN (This would imply you have read rights to that OU from the WFE) 
$users = Get-ADUser -filter * -searchbase "OU=UserList,DC=DESTDOMAIN,DC=MYCOMPANY,DC=COM" | Select SamAccountName | Sort SamAccountName
foreach ($user in $users)
{
   #Here, note we're getting the user ID in TARGETDOMAIN, but issuing the command to SOURCEDOMAIN
   $act = "SOURCEDOMAIN\$($user.Samaccountname)"
  write-host "Attempting to cache $act" -foregroundcolor yellow
   & .\PSGetSid.exe $act   
   $line = "$PSGETSID $act"
   $line | out-file -filepath $BatchFileName -Encoding "ASCII" -Append 
   write-host "Done with $act" -foregroundcolor blue  
}


#Second OU EXAMPLE - querying the SOURCEDOMAIN when you've restricted access to the OU on the TARGETDOMAIN

#Need to specify the -Server parameter to query the SOURCEDOMAIN domain.

#Here what we're doing is going back to the original domain where the active user accounts exist - by active, I mean these are the ones they are logging in with each day.
#Doing this is similar, but note we need to specify the -Server parameter, and oddly, you don't actually specify a server name there, you specify the name of the Source Domain
#Also, as I wrote this, it occurred to me that it's quite possible that this query alone is doing the same thing as PSgetSid.exe, so maybe that's not needed in this case (where the SOURCEDOMAIN is being queried) - I'll have to test it one day...

$users = Get-ADUser -filter * -searchbase "OU=UserList,DC=SOURCEDOMAIN,DC=OLDCOMPANY,DC=COM" -Server SOURCEDOMAIN.OLDCOMPANY.COM | Select SamAccountName | Sort SamAccountName
foreach ($user in $users)
{
   $act = "SOURCEDOMAIN\$($user.Samaccountname)"
   write-host "Attempting to cache $act" -foregroundcolor yellow
   & .\PSGetSid.exe $act   
   $line = "$PSGETSID $act"
   $line | out-file -filepath $BatchFileName -Encoding "ASCII" -Append 
   write-host "Done with $act" -foregroundcolor blue  
}

write-host "Done $($users.count) users attempted to cache" -foregroundcolor green
stop-transcript

Ok so that’s the script – put it in the same folder as the PSGetSid.exe file

Now you might be wondering, what’s that FixMyAD.bat file it’s creating?
Well, it’s not really needed, but here’s the thought behind it – if you look at what it’s doing, FixMyAd.bat is just the PSGetSid.exe command repeated a bunch of times for each user in your environment. For a while, I was having trouble getting PSgetSid.exe to run when shelled out from Powershell, so I added that code with the thought that PS could generate the command and then a different job could run the batch file – it turned out to not be necessary, but I left it in there – it might be handy for some edge cases.

Normally, I’d schedule the PowerShell to run per the video Here: http://youtu.be/oJ4nktysxnE however, in this domain, it wasn’t having any of that – the security policy would not run unsigned scripts – I tried to change it, but that registry key was locked down. Darn!

Luckily I had another trick at my disposal: A small batch file launcher to launch PowerShell with the security disabled:

This is a non-PowerShell batch file that launches PowerShell with an execution policy of Bypass, then calls the PS1 file with the script and runs it.

pslauncher.bat:

powershell -Command "& {Set-ExecutionPolicy -scope currentuser -executionpolicy bypass}" -NoExit
powershell -Command "& {d:\ADHack\PreCache.ps1}" -NoExit

Note that I also had to set the scheduled task to “Run with highest privileges”

So that's the solution I ended up using to rig the LSA cache so it would return the correct user IDs.
Oh, one more note – if you want to test whether this is working, you might want to reset the LSA cache. As best I could tell, there is no way to do this directly, but I think you can set the two registry keys to zero, do a test to see what an uncached response is, then set the registry back and test again. No reboot was needed (the keys are documented at the top of the PowerShell script).
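
If you want to try that, something like this is what I have in mind – untested, as noted, and using the same keys and values from the script above:

#untested sketch: zero out the LSA cache settings, test an uncached lookup, then restore the values
$lsa = "HKLM:\SYSTEM\CurrentControlSet\Control\LSA"
set-itemproperty -path $lsa -name LsaLookupCacheRefreshTime -value 0
set-itemproperty -path $lsa -name LsaLookupCacheMaxSize -value 0
# ...do your uncached test here...
set-itemproperty -path $lsa -name LsaLookupCacheRefreshTime -value 40
set-itemproperty -path $lsa -name LsaLookupCacheMaxSize -value 131072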

Also, please accept my apologies if this whole thing doesn’t make sense etc… Craig did a great job explaining in his original article, so I didn’t want to rehash all that here, but did want to capture the solution I used since his didn’t show that.

Cool scripts for documenting your farm.

Update:
When we ran this script on a new environment, it was pretty cool, but in production, it generated some unreasonably large files.

Reader Amy also posted her experiences in the comments below and had a 3GB file!

So this will take some more work – looks like we'll need to identify what is important and what kinds of changes we'd want to track, and trim down the script as a result.

Original article:


My awesome co-worker Stevan found this link today and it is such a great find that I wanted to put it here to share with all, plus it helps me find it later!

Basically it’s a PowerShell script that catalogs everything in your SharePoint farm and exports to XML Files

This link is to the 2013 version, but there is a link to the 2010 version on the same page:

http://technet.microsoft.com/en-us/library/ff645391(v=office.15).aspx

My intention is to modify the script so that the output gets placed into folders named Year-Month-Date that way I can create a scheduled task in windows, and run this on a regular basis. Then it’ll be a simple matter of using a tool like Windiff to compare files from one folder to another and I should know quickly what changed.

When I get to that point, I’ll paste the updated code below.

– Jack

 

Fun configuring Office web apps 2013 (OWA)

Note: This is not a "how to" article on the steps you need to take to configure OWA in 2013; it is a listing of the problems I ran into while configuring mine, and the solutions I found. If you're installing OWA, you'll definitely want to read this, as it will probably help you, but it's not a step-by-step guide on how to do it…

I ran into a few problems while configuring OWA for use in a new SharePoint 2013 farm today.

The DNS name you pick matters

I wasn’t sure of the architecture of this whole thing and one question was bugging me…

Does the user need direct access to the OWA server? Or do the SharePoint Web Front ends act as a proxy for it?

I set up a config and ran Fiddler on the client with the browser to see what server(s) it connected to – sure enough, it connects to the OWA server(s) directly.

This has some implications for us if we’re making this available on the internet…
Think of this for a moment: in 2010, OWA was a "Service Application" – if your user had access to the WFE, SharePoint took care of the rest, even if OWA was on a different box (or boxes). It was magic.

With OWA2013, you’ll need an external IP address, an SSL certificate your browsers will see as valid, and NLB or an external load balancer if you’re using more than one OWA box.
It also means you’ll want to build your OWA farm with a legit DNS name and not the machine name.

Uninstalling an old config:

First, I had messed around with using HTTP before my certificates were ready.
No problem, I thought, I'll just remove that initial farm config I had built with the New-OfficeWebAppsFarm command – surely there is a Remove-OfficeWebAppsFarm command, right?

Wrong.

get-command *office*

The above shows me 3 commands for creating, but only 2 for removing!

Yep, I can use New-OfficeWebAppsFarm, but there is no Remove-OfficeWebAppsFarm!

It turned out to be easy enough; on my one-box farm in Dev, I just needed to use

Remove-OfficeWebAppsMachine

Note that if you have more than one machine in the OWA farm, you’d need to run the above command on each box, saving the first one (master) until last.

With that cleared up, I was ready to install with my cert.

The next thing I ran into – the new-OfficeWebAppsFarm command wants the “friendly name” of the cert, not the domain name – it was easy enough to figure that out, but it threw me for a loop.
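
If you don't know the friendly name off-hand, the certificate provider will show it – this assumes the cert was imported into the local machine's Personal store:

#list the certs in the local machine's Personal store so you can grab the friendly name
get-childitem Cert:\LocalMachine\My | select FriendlyName, Subject, NotAfter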

Time for my first OWA farm install!

Here I ran the command below on the first node of OWA:

New-OfficeWebAppsFarm -internalURl https://myurl.com -editingenabled -certificate "Mycertsfriendlyname"

Everything seemed pretty great…

Confirming it works

Except when you’re done, you’re supposed to check that it works by visiting this URL:

https://myurl.com/hosting/discovery

This is supposed to display a nice little snippit of XML – one that looks familiar if you’ve ever pulled up a web service in a browser.

Instead I got an error.

HTTP Error 500.21 – Internal Server Error
Handler “DiscoveryService” has a bad module “ManagedPipeLineHandler” in its module list

It looked like asp.net wasn’t registered.

I searched the error, and found this article: http://consulting.risualblogs.com/blog/2013/04/03/office-webapps-deployment-error/

bottom line, I ran this command:

c:\windows\Microsoft.net\framework\v4.0.xxxx\aspnet_regiis.exe -ir

and that re-bound ASP.net to IIS.

So far so good, the page was coming up now on the first node.

Adding a second OWA server

Now time for the second node.

To add a 2nd-99th OWA node, the command is a little different:

New-OfficeWebAppsMachine -MachineToJoin myowaurl.com

Here, I ran into one more issue – kinda simple but an “of course!” kinda moment.

When I ran the command the first time, it couldn’t find the OWA server to connect to.

Care to guess why?

Because these are all behind a load balancer, I had host entries on both OWA servers that pointed back to themselves. This is common so that you can log on to one server and confirm THAT server is working.  Well, I got ahead of myself here, and the new-officewebappsmachine command was being redirected back to itself.

Not-going-to-work

Easy fix, I pointed the host entry to the other server long enough to do the command, then set it back so I could test it locally.  I’m glad I did that, because the second node had the same problem as the first, something I might not have found without the host entry.
I ran that aspnet_regiis.exe -ir command a second time, and did an IIS reset and things were looking good on my second OWA node.

Oh and one more goofy thing I noticed…

Configuring SP to talk to OWA

No OWA build is complete until it’s registered with SharePoint 2013 so SP knows where to send the traffic.

To do this, on the SharePoint server we run this command:

New-SPWOPIBinding -Servername urltoyourserver.com

Now what's odd here is that it rejected my server name multiple times.

It did not like:

  • “Https://myurl.com/”
  • https://myurl.com
  • “Myurl.com” (I don’t know if I actually tried this one)

What worked was just using myurl.com – no quotes, no https://

As soon as I finished the stuff above, I thought I was 'done' – after all, the commands worked, so OWA must work now, right?

Kinda.

Don’t test OWA with the System account

I ran into a few more issues when I tried to test this. Remember that bit above, where I said on each OWA server, I had put in a host entry so that I could log on to the server and test just that server? Well, I was logging on to the server as the SharePoint "System Account" – and guess what? That doesn't work. Now you might expect a nice error that says something like "You can't use this as the system account" – Nope. Just a nice correlation ID to go hunt down. And in the ULS logs what do we see? Well, nothing that indicates that I can't use the admin – I dug up the solution by searching the web.
So I tested with a different account, which solved one problem, but it still wasn't working…

Make sure the individual WOPI bindings match the overall zone

You configure the connection between SharePoint and OWA on the SP box by using commands like this:

New-SPWOPIBinding –ServerName dnsnameof.yourserver.com
write-host "Setting Zone"
Set-SPWOPIZone -zone "External-HTTPS"

Now, you might ask, how did I choose the zone?  There are 4 choices, and somewhere I read that if this site is going to be available internally and externally, use “External-HTTPS”

Not so fast..

As it turns out there’s another command that’s relevant here:

Get-SPWOPIBinding

This will list out a bunch of individual bindings for each file type.
Guess what those bindings said? Internal-HTTPS.

Another search on the internet and yeah, these should have been the same as the overall zone set by the Set-SPWOPIZONE command I used above.
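
A quick way to see the mismatch side by side is something like this – note that WopiZone is the property name I've seen on the binding objects, so check Get-Member if that column comes back empty:

#compare the farm-wide zone with the zone each binding is using
Get-SPWOPIZone
Get-SPWOPIBinding | group WopiZone | select Name, Count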

Wanting a "quick win", it seemed faster to change the overall zone by reissuing the Set-SPWOPIZone command than to figure out how to update 185 bindings. I ran this:

Set-SPWOPIZONE -zone "internal-https"

And guess what??

OWA works now.

Ok, so that sums up my first few installs of Office Web Apps 2013 and tying them to SharePoint 2013 in a multi-server OWA farm.

That's a lot of material; let's see if I can sum this up with a nice bulleted list:

  • Use a real DNS name for your OWA box, not the machine name.
  • It's possible to remove an OWA farm config, even though there is no "Remove-OfficeWebAppsFarm" cmdlet.
  • Don’t put host entries on your OWA boxes until after you’re done with the install.
  • When configuring the WOPI bindings on the SharePoint farm, don’t use https:// in the DNS name.
  • Make sure the zone listed for each binding matches the zone set for the overall connection.
  • When you test OWA, don’t use the Farm account.

– Jack