Mapping OneDrive for Business

What if I want to access my OneDrive files in Windows Explorer, not Internet Explorer or the OneDrive for Business app? The OneDrive for Business app works fine (sometimes) for accessing the files from a laptop or workstation, but it synchronizes the contents to the local drive. That is fine on a single machine, but when the users are working in a Remote Desktop or Citrix environment you do not want to synchronize every user's OneDrive files to the session host, and accessing the files from a web browser is not optimal either.

This article will show you how to access your OneDrive for Business files by adding a network place. This guide is based on a Remote Desktop farm running Windows Server 2012 R2, where the users connect to Office 365 through ADFS with SSO. That setup is not required, but it is the only one I have tested.

Add https://*.sharepoint.com to the Trusted sites zone in Internet Explorer. I do this with Group Policy under the following path:

User Configuration/Administrative Templates/Windows Components/Internet Explorer/Security Page

In the setting "Site to Zone Assignment List", enter:

Value Name: https://*.sharepoint.com Value: 2
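With the site trusted, the document library can be added as a network place or mapped as a drive while you have an authenticated Internet Explorer session (sign in to Office 365 and tick "Keep me signed in" first). A minimal sketch; the tenant name and user path below are examples, so copy the real URL from the library's address bar:

```powershell
# WebDAV access requires the WebClient service to be running
Start-Service WebClient

# Map O: to the user's OneDrive for Business document library.
# "contoso" and "user_customerdomain_com" are placeholders.
net use O: "https://contoso-my.sharepoint.com/personal/user_customerdomain_com/Documents" /persistent:yes
```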





Quick way to extend your self-signed certificate using PowerShell

Certificates are not my favorite thing and I guess I am not the only one. This method to renew a self-signed certificate made life a bit easier.

Scenario: a certificate needs to be renewed and distributed to client computers using Group Policy.

First thing: Get the thumbprint from the old certificate.

Open PowerShell as administrator:

Copy the thumbprint from the certificate you want to renew.
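Assuming the certificate lives in the local machine personal store, the lookup and the renewal can be done like this (on Server 2012 R2 and later, New-SelfSignedCertificate accepts -CloneCert; the thumbprint below is a placeholder):

```powershell
# Find the certificate and note its thumbprint
Get-ChildItem Cert:\LocalMachine\My | Format-Table Thumbprint, Subject, NotAfter

# Clone the old certificate; the copy is valid for one year from today
$old = Get-Item Cert:\LocalMachine\My\0123456789ABCDEF0123456789ABCDEF01234567
New-SelfSignedCertificate -CloneCert $old -CertStoreLocation Cert:\LocalMachine\My
```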

Voilà, a copy of the old certificate is created, valid for one year.

To export the certificate for distribution through the Group Policy Management console, just copy the new thumbprint and run the following command.
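Something along these lines, with an example thumbprint and path; Export-Certificate writes the public certificate only, which is all the clients need:

```powershell
# Export the renewed certificate to a .cer file for Group Policy distribution
Export-Certificate -Cert Cert:\LocalMachine\My\89ABCDEF0123456789ABCDEF0123456789ABCDEF -FilePath C:\Temp\renewed.cer
```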

There are still some things to wish for; I wish I could generate a certificate with more than one year's validity, but you can't get everything.

The “getting rid of old crap” way of moving from ADFS 2.0 and DirSync to ADFS 3.0 and AAD Sync.

This gave me a headache, so I need to write it down. The project meant moving an existing ADFS 2.0 and DirSync installation without TMG to a new setup made up of ADFS 3.0 and AAD Sync on one server and WAP on another. I started off using Windows Internal Database (WID) for ADFS, and SQL LocalDB was installed by AAD Sync. When I imported the configuration from the ADFS 2.0 server, the ADFS 3.0 service would not start and logged event ID 220; looking at the ADFS configuration file, it appeared I had imported far more configuration than I wanted to. I then proceeded with an alternate migration path.


After resetting the new ADFS server I installed SQL Server Express 2014 with tools and configured a default instance on the server, so that I didn't have to be concerned about running both Windows Internal Database (for ADFS) and SQL LocalDB (for AAD Sync). I used the latest version of AAD Connect for the installation, since it does most of the job for you.

I started off by leaving the external mapping of port 443 for the external DNS record fs.customerdomain.com pointing to the old ADFS server. On the ADFS 2.0 server I started "Microsoft Online Services Module for Windows PowerShell" as an administrator.

I connected to the customer tenant using the following commands.
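These are the standard MSOnline cmdlets for connecting to a tenant:

```powershell
# Load the MSOnline module and sign in with tenant admin credentials
Import-Module MSOnline
Connect-MsolService -Credential (Get-Credential)
```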

The next step was to convert the already federated domain customerdomain.com to a standard managed domain WITHOUT converting the users to standard users. This was achieved with the following command.
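The command, with an example password file path:

```powershell
# Convert the domain to managed; -SkipUserConversion leaves the user
# accounts alone. A password file path must still be supplied even
# though no file is written in this case.
Convert-MsolDomainToStandard -DomainName customerdomain.com -SkipUserConversion $true -PasswordFile C:\Temp\passwords.txt
```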

The password file is not created when using the -SkipUserConversion parameter, but the command didn't let me skip it, so I just added it anyway. This got rid of the federation configuration for the domain in the Office 365 tenant, so the AAD Connect wizard can configure the new federation for the domain against the new servers.

The next step was exporting the certificate for fs.customerdomain.com with the private key to a PFX file from the old ADFS 2.0 server and importing it into the Local Computer personal store on the new, soon-to-be ADFS 3.0 server. After that I changed the external mapping of port 443 from the internal IP of the old ADFS 2.0 server to point to the internal IP of the soon-to-be Web Application Proxy server. Since I use split DNS, I also had to change the internal record for fs.customerdomain.com to point to the internal IP of the ADFS 3.0 server, since internal authentication requests do not need to pass through the Web Application Proxy. I also stopped and disabled the services for the old ADFS 2.0 server and the old DirSync services.


After this was done I started the AAD Connect wizard, and it worked all the way. Since the customer did not synchronize the whole directory to Office 365, it was important to remember to uncheck the box "Start the synchronization process as soon as the configuration completes" at the end of the wizard, then start the AAD Sync UI and configure the correct OU to synchronize. After that is done, you enable the scheduled task for AAD Sync.


Manage test and production groups in WSUS from PowerShell with PoshWSUS

I have always found the WSUS console lacking functionality for handling multiple server groups with test and production environments, and this is where PoshWSUS comes in really handy.

I will show you the basics of how to manage test and production groups and verify patch assignment.

You can find the PoshWSUS module here.

Loading the module and connecting to your WSUS server
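The server name below is an example, and 8530 is the default port for a non-SSL WSUS installation on Server 2012 R2; if your PoshWSUS version uses slightly different cmdlet names, check Get-Command -Module PoshWSUS:

```powershell
# Load the module and connect to the WSUS server
Import-Module PoshWSUS
Connect-PSWSUSServer -WsusServer wsus01 -Port 8530
```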

Select the group for your test environment that already has the latest patches assigned and deployed, and where patch functionality is verified.

Get all the patches from your test environment and assign them to your production group
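PoshWSUS wraps the WSUS administration API, so if you are unsure of the exact cmdlet names in your module version, the same copy can be done against the API directly. A sketch; the server name, port, and the group names Test and Production are examples:

```powershell
# Connect to WSUS via the administration API
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.UpdateServices.Administration')
$wsus = [Microsoft.UpdateServices.Administration.AdminProxy]::GetUpdateServer('wsus01', $false, 8530)

$test = $wsus.GetComputerTargetGroups() | Where-Object { $_.Name -eq 'Test' }
$prod = $wsus.GetComputerTargetGroups() | Where-Object { $_.Name -eq 'Production' }

# Approve for Production every update already approved for Test,
# printing the title of each update as it is processed
foreach ($update in $wsus.GetUpdates()) {
    if ($update.GetUpdateApprovals($test).Count -gt 0) {
        [void]$update.Approve([Microsoft.UpdateServices.Administration.UpdateApprovalAction]::Install, $prod)
        $update.Title
    }
}
```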

When the script has finished running you will get a list of all the patches assigned to the production group.

Let's verify that the production environment has the same patches assigned as the test environment, and list any missing patches.
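The comparison can be sketched the same way against the administration API (server name, port, and group names are examples):

```powershell
[void][Reflection.Assembly]::LoadWithPartialName('Microsoft.UpdateServices.Administration')
$wsus = [Microsoft.UpdateServices.Administration.AdminProxy]::GetUpdateServer('wsus01', $false, 8530)
$test = $wsus.GetComputerTargetGroups() | Where-Object { $_.Name -eq 'Test' }
$prod = $wsus.GetComputerTargetGroups() | Where-Object { $_.Name -eq 'Production' }

# List updates approved for Test that lack an approval for Production
$wsus.GetUpdates() | Where-Object {
    $_.GetUpdateApprovals($test).Count -gt 0 -and
    $_.GetUpdateApprovals($prod).Count -eq 0
} | Select-Object -ExpandProperty Title
```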

If no patches are returned, your test environment now matches your production environment in assigned patches.



Checking the status of multiple scheduled jobs on multiple servers.

I got tired of not knowing the status of several scheduled jobs scattered across multiple servers. This script resolves that by checking the last run status of each job I define in the input file and sending me an alert if something went wrong.

After struggling for a while trying to find a way to read the last status of a job, I ended up using schtasks.exe in the script; it turned out that I could not retrieve the last status code through WMI for a job created with the scheduled tasks wizard. The script is pretty basic: it uses an input file, a regular txt file, listing the server name and then the job name in the following format.


I create an output file just because it might be handy if the e-mail somehow gets lost on the way. I then run a foreach to process every line in the txt file, and if the job status is anything other than "0" I output that to a file and send an e-mail with a customized message containing the server name, the job name and the last status code.

Without further delay, here is the script.
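The original script is not reproduced here, so this is a minimal sketch of the approach described above; paths, addresses, and the input format (one server,jobname pair per line) are examples:

```powershell
# Input file: one "server,jobname" pair per line, e.g. SRV01,NightlyBackup
$report = 'C:\Scripts\jobstatus.txt'

foreach ($line in (Get-Content 'C:\Scripts\jobs.txt')) {
    $server, $job = $line -split ','

    # The /V (verbose) CSV output includes the "Last Result" column
    $info = schtasks.exe /Query /S $server /TN $job /FO CSV /V | ConvertFrom-Csv

    if ($info.'Last Result' -ne '0') {
        $msg = "Job '$job' on $server returned last status $($info.'Last Result')"
        Add-Content -Path $report -Value $msg
        Send-MailMessage -To 'admin@customerdomain.com' -From 'jobs@customerdomain.com' `
            -Subject "Scheduled job failed on $server" -Body $msg -SmtpServer 'smtp.customerdomain.com'
    }
}
```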



Import specific GPOs through PowerShell from the Backup All script


This script is based on the Backup All GPOs script found here.

The script will loop through all the GPOs, and you will be presented with a pop-up asking if you wish to import each specific GPO.
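A sketch of the loop, assuming the backups sit in one folder and each backup's display name is read from its bkupInfo.xml (the path is an example):

```powershell
Import-Module GroupPolicy
Add-Type -AssemblyName System.Windows.Forms

$backupPath = 'C:\GPOBackup'   # example path

foreach ($backup in (Get-ChildItem $backupPath -Directory)) {
    # Each backup folder is named after the backup GUID; the friendly
    # name of the GPO is stored in bkupInfo.xml inside the folder
    [xml]$info = Get-Content (Join-Path $backup.FullName 'bkupInfo.xml')
    $name = $info.SelectSingleNode('//GPODisplayName').InnerText

    $answer = [System.Windows.Forms.MessageBox]::Show("Import GPO '$name'?", 'Import GPO', 'YesNo')
    if ($answer -eq 'Yes') {
        Import-GPO -BackupId $backup.Name -Path $backupPath -TargetName $name -CreateIfNeeded
    }
}
```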

Backup all GPOs in the domain through PowerShell
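The backup itself is close to a one-liner; a dated folder keeps runs apart (the path is an example):

```powershell
Import-Module GroupPolicy

# Back up every GPO in the domain to a dated subfolder
$path = 'C:\GPOBackup\{0:yyyyMMdd}' -f (Get-Date)
New-Item -ItemType Directory -Path $path -Force | Out-Null
Backup-GPO -All -Path $path
```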


Bulk Creation of Room Mailboxes Using PowerShell


The question came up during a meeting: who wants to be responsible for creating the room mailboxes? I figured it would be interesting to brush up on PowerShell, so I grabbed it. I started off creating a CSV file containing the following headers.


DisplayName,UserPrincipalName,Office,ResourceCapacity,OrganizationalUnit


Regarding Office: this is what shows up in the address list, so if you have several locations it is a good idea to populate this property. When the CSV file was ready (make sure it's comma-separated, not semicolon-separated) I ran the following command.

import-csv [pathtocsv] | foreach-object {new-mailbox -Name $_.DisplayName -DisplayName $_.DisplayName -UserPrincipalName $_.UserPrincipalName -Office $_.Office -ResourceCapacity $_.ResourceCapacity -OrganizationalUnit $_.OrganizationalUnit -Room}


Since the rooms will not be in use for another couple of weeks, I ran the following command to hide them from the address lists. This CSV contained the alias of the room mailboxes.

import-csv [pathtocsv] | foreach-object {set-mailbox -Identity $_.Alias -HiddenFromAddressListsEnabled $true}


Since I want all users to see who the meeting organizer is, I created one more CSV file with a column, mbxident, containing the calendar folder identity of each room (in the form alias:\Calendar), and ran the following command.



import-csv [pathtocsv] | foreach-object {set-mailboxfolderpermission $_.mbxident -User Default -AccessRights ReadItems}


Looking back, you could put all the values in the same CSV file to prepare everything before you run the commands; the CSV file would then look something like this.

DisplayName,UserPrincipalName,Office,ResourceCapacity,OrganizationalUnit,Alias,mbxident


When formatting the CSV file, don't forget to quote values that contain spaces, and make sure you don't have trailing white space.



Add multiple Windows patches to a WIM using DISM and PowerShell
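The title describes mounting a WIM and injecting patches with the DISM PowerShell module; a sketch of that workflow, with example paths and image index:

```powershell
# Paths and image index are examples
$wim     = 'C:\Images\install.wim'
$mount   = 'C:\Mount'
$updates = 'C:\Updates'

New-Item -ItemType Directory -Path $mount -Force | Out-Null
Mount-WindowsImage -ImagePath $wim -Index 1 -Path $mount

# Inject every .msu and .cab found in the updates folder
Get-ChildItem $updates -Recurse -Include *.msu, *.cab | ForEach-Object {
    Add-WindowsPackage -Path $mount -PackagePath $_.FullName
}

# Commit the changes back into the WIM
Dismount-WindowsImage -Path $mount -Save
```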


SCCM and Adobe Reader XI with PowerShell


How to distribute Adobe Reader XI in System Center

1. Download the .msi package from Adobe here

2. Download the Adobe Customization Wizard XI here

3. Make your customizations to fit your IT needs in the Customization Wizard and generate a .mst file

4. Create a PowerShell file with the following code
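Something along these lines; the file names are examples, so match them to the .msi and .mst in your package source folder:

```powershell
# Install Adobe Reader XI silently with the transform generated by
# the Customization Wizard; the files are expected next to this script
$source = Split-Path -Parent $MyInvocation.MyCommand.Path
Start-Process msiexec.exe -Wait -ArgumentList `
    "/i `"$source\AcroRead.msi`" TRANSFORMS=`"$source\AcroRead.mst`" /qn /norestart"
```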
