Yearly Archives: 2014

Using WAP to publish SharePoint on the internet.

Sam Hassani over at Brightstarr put together a couple of great articles on exposing SharePoint over the internet through Web Application Proxy (WAP), a feature of Windows Server 2012 R2.

These articles present WAP as a nice alternative to UAG and TMG (two security products Microsoft used to sell, but has since retired). I haven’t followed the steps yet, but it looks like it might be a suitable fill-in for CA SiteMinder in some cases as well.

Part 1: http://www.brightstarr.com/sharepoint-technology-and-application-insights/securely-publishing-sharepoint-externally-using-web-application-proxy

Part 2: http://www.brightstarr.com/sharepoint-technology-and-application-insights/securely-publishing-sharepoint-externally-using-web-application-proxy-part-2

The best part of the above articles, in my opinion, is that they really make this stuff look easy and accessible – I can’t wait to try this out!

New Article: SSL certificates and SharePoint

This article covers several different options for using SSL certificates in SharePoint – it should be very helpful if you’re new to SSL or if you’re hitting one of the edge cases such as multi-name certificates.

  • Single SharePoint Server, single URL + SSL
  • Multiple Web Front end SharePoint servers, sharing one URL using one SSL certificate
  • Single/Multiple SharePoint Server(s), with multiple URL’s each using different SSL certificates on different IP Addresses
  • Single/Multiple SharePoint Server(s), with multiple URL’s each using the same wildcard SSL certificate on the same IP address
  • Single/Multiple SharePoint Server(s), with multiple URL’s each using the same Multi-name (also known as Subject Alternative Name) SSL certificate on the same IP address

http://sharepointjack.com/ssl-certificates-and-sharepoint/

SharePoint ULS logs Tips, Tricks, and Thoughts

ULS logs are where SharePoint records what’s happening; they are plain text files.

Here are a few thoughts on optimizing your workflow when dealing with the ULS logs.

#1 – Know where your logs are.

In Central Administration, under Monitoring->Diagnostic Logging, you can set the path for the ULS logs. This sets the path for EVERY server in your farm, so you’ll need to ensure the drive letter exists and has space on EACH server. In general, you don’t want these on the C: drive, which, unfortunately, is the default. I put them in a folder called D:\SharePoint_Logs
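
If you prefer to script it, the same setting can be read and changed with the SharePoint cmdlets. A quick sketch – the D:\SharePoint_Logs path is just my example; use a drive that exists on every server:

```powershell
Add-PSSnapin "Microsoft.SharePoint.Powershell" -ErrorAction SilentlyContinue

# See where the ULS logs are currently being written
(Get-SPDiagnosticConfig).LogLocation

# Move them off the C: drive (the path must exist on EVERY server in the farm)
Set-SPDiagnosticConfig -LogLocation "D:\SharePoint_Logs"
```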

#2 – Don’t log more than you need to!

ULS log files can be HUGE – on the Diagnostic Logging screen (Central admin->Monitoring->Diagnostic Logging) consider doing the following:

  • Under “Event throttling”, select “All Categories”, then set the drop-downs in that section to “Reset to default”
  • Under the “Trace Log” area, set the number of days to something reasonable – i.e., decide how far back you’d be willing to go to look at a problem; in many cases, 7 days should be sufficient.
  • Under the “Trace Log” area, restrict trace log disk space – my log drive is 80GB, but it’s also shared with a few other logs like the usage logs, and some CSVs I generate with PowerShell – so I limit ULS logs to 40GB
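
Those same settings can also be made from PowerShell – here’s a sketch using the standard cmdlets (the 7-day and 40GB values just mirror the suggestions above):

```powershell
Add-PSSnapin "Microsoft.SharePoint.Powershell" -ErrorAction SilentlyContinue

# "Reset to default" for every Event Throttling category
Clear-SPLogLevel

# Keep 7 days of trace logs, capped at 40GB of disk space
Set-SPDiagnosticConfig -DaysToKeepLogs 7 `
                       -LogMaxDiskSpaceUsageEnabled:$true `
                       -LogDiskSpaceUsageGB 40
```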

#3 – Use the ULS Log Viewer.

Several viewers of this type have been made, but there is “the one” that everyone always uses, and it’s available here: http://www.toddklindt.com/ulsviewer

This used to be on CodePlex, but sadly, it wasn’t updated and CodePlex auto-deleted it. Todd was gracious enough to put a copy on his site so others can continue to use this great tool.

Personally, I keep a copy of the ULS Log Viewer right in the folder with the ULS logs – that way it’s nice and handy.

What the ULS Log Viewer does is parse the log file and display it in a very user-friendly way – I used to use Notepad++ to look at the logs, and this is much, much better for log viewing. It has the option of opening a log file, or of following the ULS logs in real time. One nice thing about it is that it seems to “know” where your ULS logs are, so you can open the current one without a bunch of mouse clicks. It’s also great at letting you filter the logs, for example, to see only the critical events.

#4 – PowerShell related ULS commands

There are a few PowerShell commands for the ULS logs that are handy to know about. This MSDN page covers all of them; below you’ll find a few that I’ve personally used, with a longer explanation than what’s on MSDN.

Merge-SPLogFile

What this one does is combine the ULS log entries from all the servers in your farm and merge them into a single file on the box where the command was run – very helpful for tracing issues between boxes in your farm.

There are a lot of ways to use the command, so try get-help Merge-SPLogfile -detailed for the finer details (in particular, note the -StartTime and -EndTime parameters so you are only merging manageable amounts of data!)
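
For example, to pull just the last hour of activity from every server into one file (the output path is just an example):

```powershell
Add-PSSnapin "Microsoft.SharePoint.Powershell" -ErrorAction SilentlyContinue

# Merge the last hour of ULS entries from ALL farm servers into one local file
Merge-SPLogFile -Path "D:\SharePoint_Logs\Merged_LastHour.log" `
                -StartTime (Get-Date).AddHours(-1) `
                -EndTime (Get-Date) `
                -Overwrite
```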

Also, a quick search found this great article by Cecildt, with screenshots and lots more detail than I have here: http://cecildt.blogspot.com/2012/10/sharepoint-2010using-merge-splogfile-to.html

New-SPLogFile

This command closes the current ULS log file and starts a new one. To my knowledge it only does this on the server where you run the command (i.e., if you have 3 servers, you’ll need to run it on each one). I’ve used this before when looking at a system in real time with a support person from MS: we ran this command to reset the log file, then recreated a problem, then ran the command again – and what we were left with was a very nice, small ULS log that was reasonably timeboxed around the event we were investigating. We could have done the same with Merge-SPLogFile by setting times, but that command is slower, and we would have needed to jot down the start and end times. In other words, New-SPLogFile doesn’t do anything you couldn’t do with other tools; it just makes it easier under the right circumstances.
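
That “bracket the repro” workflow looks roughly like this (run it on each server whose logs you care about):

```powershell
Add-PSSnapin "Microsoft.SharePoint.Powershell" -ErrorAction SilentlyContinue

# Close the current ULS log and start a fresh one
New-SPLogFile

# ...reproduce the problem now...

# Roll the log again - the file created above now covers just the repro window
New-SPLogFile
```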

– Jack

Deployinator

At work I typically deploy WSPs to our environments.

A situation came up where it was necessary to redeploy a WSP multiple times, maybe dozens of times, as a developer worked through an issue.

We needed a fast way to allow the developer to deploy the WSPs, but with a catch – we couldn’t give him RDP access to the server, and I knew that meetings and such would prevent me from turning around the requests as quickly as needed.

The solution:

Deployinator

I configured this script to run as a scheduled task in Windows, running every 2 minutes.
(For help on scheduling a task, see Schedule your PowerShell Scripts using the Windows Task Scheduler and its accompanying 2 min video)

Notes are below the script so be sure to scroll down…

#purpose - check a directory for a file
#if found, update the solution, then move the file to a backup directory
#Jack Fruh 
Add-PSSnapin "Microsoft.SharePoint.Powershell" -erroraction Silentlycontinue
$dt = get-date -format "yyyyMMdd_hhmmtt"

$loc = get-location
$monitorfolder = "WSPDropFolder"
$archivefolder = "Archive"
$wsp = "Name.of.Your.wsp"
$source = "$($loc.path)\$monitorfolder\$wsp"
$dest = "$($loc.path)\$archivefolder\$($dt)_$wsp"

if (test-path $source)
{
    # allow enough time 
    # in case the file was being copied 
    # at the exact moment the script ran:
    sleep 10
    $dt = get-date -format "yyyyMMdd_hhmmtt"
    write-host "file $source found - deploying"
    Update-SPSolution -Identity $wsp -LiteralPath $source -GacDeployment
	  
    move-item $source $dest
	  
    $EmailFrom = "Deployinator@yourdomain.com"
    $EmailTo = "developer@yourdomain.com"
    $EmailCC = "you@yourdomain.com"
    $EmailSubject = "Code moved to Stage"
    $EmailBody = "Hi There, at $dt $source was updated in stage, and moved to $dest"
    $SMTPServer = "smtp.yourdomain.com"

    Send-MailMessage -From $EmailFrom -To $EmailTo -CC $EmailCC -Subject $EmailSubject -body $EmailBody -SmtpServer $SMTPServer	  
}
else
{
    write-host "No File to deploy found at $source"
}

The WSPDropFolder is shared (i.e., a Windows shared folder on the server) with the developer – the developer is the only one who has access, so that should help on the security side of things.

The script checks to see if the file is there. It’s looking for a specific file, by name – this helps ensure the script is only used to deploy the agreed-upon solution and not used for general deployments.

After the script updates the WSP in SharePoint, it moves the file to the Archive folder and date-stamps it.

It then finishes things up by sending the developer and myself an email so we both know the deployment completed.

“Using” the script is simple – the developer copies the WSP for deployment to the WSPDropFolder. Within 2 minutes, the script picks up the file and deploys it; the developer will see the folder is empty again, and will receive an email confirmation.

If the developer needs to redeploy, he just puts a new file in the WSPDropFolder and the process repeats.

It’s worked out great so far, allowing the developer to make and test multiple revisions to his code quickly, and without my intervention.

– Jack

I’m Presenting and Hosting SharePoint Saturday Chicago Suburbs on May 17th

If you live in the Chicago area, sign up for SPSChicagoSuburbs (http://www.spschicagosuburbs.com)

I’m on the planning team and we’ve got a great event planned, with tons of great sessions and speakers, all made possible by our fantastic Sponsors!

I’ll be presenting “Intro to PowerShell” at 10:30am

We also have a mobile website for handy use on your phone during the conference – the URL is easy to type, too! http://spscsm.com
(If you’re an iPhone user, once you have the site up, tap the ‘Share’ button, then ‘Add to Home Screen’ to create an ‘app’ icon for easy access)

I hope to see you there!

– Jack

Active Directory Migration Woes (Part 2)

In Part 1, I talked about an SP2013 farm that wasn’t behaving as a result of a lengthy in-progress AD migration, and linked to an article that described a hack, and ultimately a solution via a SharePoint setting, provided your farm was patched to a patch level from around October 2012 or later. On the 2013 farm that setting resolved the issue, and all were happy.

In Part 2, we visit the same issue on a SharePoint 2007 farm, and use the article’s “hack” to resolve the issue in the 2007 environment.
For reference, the article whose approach was used is here:
http://blogs.technet.com/b/craigf/archive/2012/10/15/sharepoint-and-sid-history-not-playing-well-together.aspx

The 2007 farm, unlike the 2013 farm, doesn’t have the option of using the SharePoint setting – it doesn’t exist in the version of SP2007 in use, and a shotgun upgrade of the SP2007 farm is not an option.

Time to re-visit the “Hack” portion of the above article…

The Problem:
Two domains, Users in Domain A, SharePoint in Domain B
The users are going to be migrated one day, from A to B.
The AD team has replicated all the users so that they now exist in BOTH Domain A and Domain B.
Along with this replication, an attribute called “SID History” was set in Domain B for each user.

Because of this, here’s how things work in Domain B…
If a program (any program, not just SharePoint) asks Domain B to resolve a user ID from a SID, Domain B will look in its list of users/SIDs.
In ordinary times, when users existed in only one place, the Domain B controller would try itself first and, not finding the ID, would then send the request off to Domain A.

In our case, however, the IDs exist in both places.

So an app requests the user ID for a given SID from the DC in Domain B. This time, Domain B finds that SID in SID history. “AH HA,” the domain says to itself. “I don’t have this ID, but I can see right here that it was migrated to an ID I DO have. I’ll just give you the DomainB\Account.” (Remember, we want DomainA\Account.)

While that would be very helpful if we were done with our migration, and really, really helpful if the Domain A account was deleted and gone, in our case we had a little problem.

The users were still using accounts from Domain A, so we needed those to resolve.

In the article, Craig Forster figured out that by using the right tool to query AD, you could pre-populate the LSA cache with the “desired” user accounts from the right domain controller, eliminating the whole name lookup and SID history piece of the puzzle.

Craig’s article mentioned the approach he used, but not the scripts.

That’s where this blog post comes in…
I crafted a solution to do what Craig’s article described.

First things first, the prerequisites:

  • PSGetSid.exe you can get this from Sysinternals
  • You’ll need PowerShell on the WFEs for your 2007 farm (regular PowerShell – we’re not touching SharePoint, so there’s no need to panic that SP2007 doesn’t support PowerShell)
  • You’ll need the AD PowerShell module – this is a Windows Server “feature”, found under “Remote Server Administration Tools” when you add a feature.

Ok now for an overview of the solution:
From Craig’s article, we know we need to query not just the domain, but each specific user.
To make this practical, we query AD for the list of users, then use the PSGetSid.exe program to look up each one.

start-transcript D:\adhack\RunLog.txt
import-module ActiveDirectory

$PSGETSID = "D:\ADHACK\PSgetSid.exe"
$batchFileName = "D:\ADHACK\FixMyAD.bat"

# this is to start the file clean so it's not appending between executions.
"Rem Start of file" | out-file -filepath $BatchFilename -Encoding "ASCII"

#Note: there is a good chance the next two registry commands will fail, for example if the values already exist - these are easy enough to set manually, so that's what you should do, but I've left them here for reference

#set the timeout - this should be a number greater than the time it takes to run this script, for example, if the script takes 15 min to run and you schedule it to run every 20 min, then you'd want this to be something like 40 minutes so if it fails once, you'd still have the values in cache.
new-itemproperty -path HKLM:\SYSTEM\CurrentControlSet\Control\LSA\ -name LsaLookupCacheRefreshTime -PropertyType dword -value 40 #time in Minutes

#set the LSA cache (Default of 128 is too small)
new-itemproperty -path HKLM:\SYSTEM\CurrentControlSet\Control\LSA\ -name LsaLookupCacheMaxSize -PropertyType dword -value 131072 #this is the decimal value

#remember, you're running this from a WFE that's joined to the DESTINATION Domain

#first OU Example - query an OU in DESTDOMAIN (This would imply you have read rights to that OU from the WFE) 
$users = Get-ADUser -filter * -searchbase "OU=UserList,DC=DESTDOMAIN,DC=MYCOMPANY,DC=COM" | Select SamAccountName | Sort SamAccountName
foreach ($user in $users)
{
   #Here, note we're getting the user ID in TARGETDOMAIN, but issuing the command to SOURCEDOMAIN
   $act = "SOURCEDOMAIN\$($user.Samaccountname)"
   write-host "Attempting to cache $act" -foregroundcolor yellow
   & .\PSGetSid.exe $act   
   $line = "$PSGETSID $act"
   $line | out-file -filepath $BatchFileName -Encoding "ASCII" -Append 
   write-host "Done with $act" -foregroundcolor blue  
}


#Second OU EXAMPLE - querying the SOURCEDOMAIN when you've restricted access to the OU on the TARGETDOMAIN

#Need to specify the -server value name to query the SOURCEDOMAIN domain at server.

#Here what we're doing is going back to the original domain where the active user accounts exist - by active, I mean these are the ones they are logging in with each day.
#Doing this is similar, but note we need to specify the -Server parameter, and oddly, you don't actually specify a server name there, you specify the name of the Source Domain
#Also, as I wrote this, it occurred to me that it's quite possible that this query alone is doing the same thing as PSGetSid.exe, so maybe that's not needed in this case (where the SOURCEDOMAIN is being queried) - I'll have to test it one day...

$users = Get-ADUser -filter * -searchbase "OU=UserList,DC=SOURCEDOMAIN,DC=OLDCOMPANY,DC=COM" -Server SOURCEDOMAIN.OLDCOMPANY.COM | Select SamAccountName | Sort SamAccountName
foreach ($user in $users)
{
   $act = "SOURCEDOMAIN\$($user.Samaccountname)"
   write-host "Attempting to cache $act" -foregroundcolor yellow
   & .\PSGetSid.exe $act   
   $line = "$PSGETSID $act"
   $line | out-file -filepath $BatchFileName -Encoding "ASCII" -Append 
   write-host "Done with $act" -foregroundcolor blue  
}

write-host "Done $($users.count) users attempted to cache" -foregroundcolor green
stop-transcript

OK, so that’s the script – put it in the same folder as the PSGetSid.exe file.

Now you might be wondering: what’s that FixMyAD.bat file it’s creating?
Well, it’s not really needed, but here’s the thought behind it – if you look at what it’s doing, FixMyAD.bat is just the PSGetSid.exe command repeated for each user in your environment. For a while, I was having trouble getting PSGetSid.exe to run when shelled out from PowerShell, so I added that code with the thought that PowerShell could generate the commands and a different job could run the batch file – it turned out not to be necessary, but I left it in there; it might be handy for some edge cases.

Normally, I’d schedule the PowerShell script to run per the video here: http://youtu.be/oJ4nktysxnE – however, this domain wasn’t having any of that: the security policy would not run unsigned scripts. I tried to change it, but that registry key was locked down. Darn!

Luckily I had another trick at my disposal: A small batch file launcher to launch PowerShell with the security disabled:

This is a non-PowerShell batch file that launches PowerShell with an execution policy of Bypass, then calls the PS1 file with the script and runs it.

pslauncher.bat:

powershell -Command "& {Set-ExecutionPolicy -scope currentuser -executionpolicy bypass}"
powershell -Command "& {d:\ADHack\PreCache.ps1}"

Note that I also had to set the scheduled task to “Run with highest privileges”

So that’s the solution I ended up using to rig the LSA cache so it would return the correct user IDs.
Oh, one more note – if you want to test whether this is working, you might want to reset the LSA cache. As best I could tell there’s no way to flush it directly, but I think you can set the two registry keys to zero, do a test to see what an uncached response looks like, then set the registry back and test again. No reboot was needed (the keys are documented at the top of the PowerShell script.)
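
Here’s a sketch of that test procedure – the restore values (40 and 131072) come from the script above, so adjust them to whatever you actually set:

```powershell
$lsa = "HKLM:\SYSTEM\CurrentControlSet\Control\LSA"

# Temporarily zero out the cache settings to force uncached lookups
Set-ItemProperty -Path $lsa -Name LsaLookupCacheRefreshTime -Value 0
Set-ItemProperty -Path $lsa -Name LsaLookupCacheMaxSize -Value 0

# ...do a test lookup here to see what an uncached response looks like...

# Restore the values the pre-cache script expects
Set-ItemProperty -Path $lsa -Name LsaLookupCacheRefreshTime -Value 40
Set-ItemProperty -Path $lsa -Name LsaLookupCacheMaxSize -Value 131072
```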

Also, please accept my apologies if this whole thing doesn’t quite make sense – Craig did a great job explaining things in his original article, so I didn’t want to rehash all that here, but I did want to capture the solution I used, since his article didn’t include the scripts.

Active Directory Migration Woes (Part 1)

The company I work for is undergoing a very long Active Directory migration project.
The result has included duplicate users in multiple domains, issues with SID history, users not showing up in SharePoint, etc…

We’ve tried lots of things to work around the state of AD and one article was pretty critical for us:

http://blogs.technet.com/b/craigf/archive/2012/10/15/sharepoint-and-sid-history-not-playing-well-together.aspx

The true gem of the article isn’t the article itself; it’s in the comments, from Brandon, on the 17th of May 2013:

This is covered in the August 2012 CU (Note, this is also part of 2010SP2) … when you run this command: STSADM -o setproperty -pn HideInactiveProfiles -pv true it will bypass disabled accounts and query the active domain.

(Interestingly, that property name doesn’t show up when you invoke help on STSADM -o getproperty)

For some more background on what we did,

Our AD team made copies of all user accounts from “OLDDOMAIN” to “NEWDOMAIN”; these copies also included SID history. When this happened, what we observed was that it became impossible in SP2013 to pick an “OLDDOMAIN\user” – they would only show up as “NEWDOMAIN\user”. Since our accounts were migrated, but the users themselves were not yet using the migrated accounts (they were still logging on as “OLDDOMAIN\user”), this created a huge problem for the SharePoint team and thousands of SharePoint users.

Part of the solution was that article; the other part was that the AD team moved the duplicated accounts in “NEWDOMAIN” to a “Holding” Organizational Unit (OU) within AD (that OU was still in NEWDOMAIN). They then asked us for the service accounts we use for SharePoint and denied access to that OU for those accounts.

The net effect of all this work is that SharePoint 2013 now behaves as it would if there were no duplicated accounts in our domain. When we search for a user, they only show up once, and from the “correct” domain.

Eventually, the AD team is going to ask users to start using the accounts in “NEWDOMAIN”. When this happens, they will pull each account OUT of the “Holding” OU, making it visible to SharePoint, and they will also deactivate the old account in “OLDDOMAIN”, which prevents duplicates from showing up.

All the credit for this solution goes to the AD team I work with for the “Holding” OU and related permissions work, and also to Craig Forster for the blog post with the original workaround, and to Brandon Ryan for posting the property name. I’ve documented it here because it’s been so impactful for us, and I wanted to be sure I had a permanent reference in case the original article is ever moved.

-Jack

I’m Presenting at SP24 on April 16th

I’m excited to be part of the 24 hour SharePoint Online Conference:

https://www.sp24conf.com

This is a 24 hour virtual online conference, that’s absolutely FREE!

The conference has 2 concurrent sessions, one business track and one technical track.

It starts at 10pm GMT on April 16th and runs through 11pm GMT on April 17th

For those from the Midwest US, that’s 5pm CDT on April 16th through 6pm CDT on April 17th.

I’m slotted for the first timeslot after the keynote, 6pm CDT.
More information available at https://www.sp24conf.com

PowerPoint:  SP24_JackFruh_PowerShellTimeMachine

Scripts: Scripts_SP24S006 SharePointPowerShellTimeMachine_JackFruh

PowerShell / SharePoint PS cheat Sheet: Combined PowerShell and SharePoint Cheat Sheet

Video on scheduling a PowerShell script to run weekly with the Windows Task Scheduler http://youtu.be/oJ4nktysxnE

How I figured it out video: http://youtu.be/0DDQszLSJ5w
This video is a 40-minute walkthrough of the process of creating a PowerShell script from scratch. It’s a great guide to the techniques used to drill down through the object model and explore parameters, variables, and objects you might not be familiar with – all from within PowerShell. It’s a great skill to have for those times when searching the internet doesn’t turn up a script that meets your needs, and for furthering your understanding of SharePoint’s internals.


– Jack