
Sunday, 24 September 2017

Network Share Migration to SharePoint Online

Network Share Migration to SharePoint Online using SharePoint Online Management Shell via Azure Storage

The Problem

A large number and volume of files on a local network share need to be migrated to SharePoint Online. The move should be automated from click to upload completion, and any document properties need to be carried over with the files.

The Approach

Install the SharePoint Online Management Shell (SPMS): https://www.microsoft.com/en-us/download/details.aspx?id=35588

Sign up for and create an Azure Storage account (fees apply): https://manage.windowsazure.com/site.onmicrosoft.com
Note the account name and account key for the script (both can be found at Office 365 admin center > Import > + > Upload files over the network):
# Azure Storage information
$azureStorageAccountName = ""
$azureStorageAccountKey = ""

Create a local folder (C:\temp or similar) to hold the XML files, and note the network document folder to upload in the source variables of the script:
$srcPath = "\\network share path\"
$pkgPath = "C:\temp\SourcePkg"
$targetPkg = "C:\temp\TargetPkg"

Run SPMS 'as Administrator' and change directory to the location of the script (I use C:\temp to keep it all together).
Run the script using .\<scriptname>.ps1:

$cred365 = (Get-Credential "adminuser@mysite.onmicrosoft.com")
-This pops up a window to get the password for the Office 365 admin account and stores the credential object in the variable for later use.

$credSP = (Get-Credential "adminuser@mysite.sharepoint.com")
-This pops up a window to get the password for the SharePoint account and stores the credential object in the variable for later use. This account should have Site Administrator permissions on the destination site.

The Result

Import-Module Microsoft.PowerShell.Security
# SharePoint Site Administrator account
$credSP = (Get-Credential "admin@site.sharepoint.com")
# Office 365 Administrator account
$cred365 = (Get-Credential "admin@domain.onmicrosoft.com")
# Azure Storage information
$azureStorageAccountName = "storagename"
$azureStorageAccountKey = "accountkey"
#sources
$srcPath = "\\network share path\"
$pkgPath = "C:\temp\SourcePkg"
$targetPkg = "C:\temp\TargetPkg"
#subsite in SharePoint Online
$targetWeb = "https://site.sharepoint.com/subsiterootpath/"
#target library in the above site
$targetLibrary = "library name"
#Create the package from the source files
New-SPOMigrationPackage -SourceFilesPath $srcPath -OutputPackagePath $pkgPath -NoAzureADLookup
#Convert the source package into one targeted at the destination site and library
#(without this step $targetPkg is empty and the upload has nothing to send)
ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath $srcPath -SourcePackagePath $pkgPath -OutputPackagePath $targetPkg -TargetWebUrl $targetWeb -TargetDocumentLibraryPath $targetLibrary -Credentials $credSP
#Upload the targeted package to Azure temporary storage and ready it for migration to the destination
$sourceinAZstore = Set-SPOMigrationPackageAzureSource -SourceFilesPath $srcPath -SourcePackagePath $targetPkg -AccountName $azureStorageAccountName -AccountKey $azureStorageAccountKey -AzureQueueName "fs2sp"
#Submit the above package for migration to the destination
Submit-SPOMigrationJob -TargetWebUrl $targetWeb -MigrationPackageAzureLocations $sourceinAZstore -Credentials $credSP
#The fully qualified URL and SAS token representing the Azure Storage Reporting Queue where import operations will list events during import.
$myQueueUri = <uri to azure report queue>
# In testing this doesn't appear to do much but is required to prevent errors
Get-SPOMigrationJobProgress -AzureQueueUri $myQueueUri
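
To check on the submitted job itself rather than the queue, the shell's Get-SPOMigrationJobStatus cmdlet can be polled against the target web - treat its availability and exact behaviour as an assumption of your shell version (confirm with Get-Command first). A minimal sketch:

# Assumption: Get-SPOMigrationJobStatus is available in this version of the shell
# Lists the state (Queued / Processing / None) of submitted migration jobs on the target web
Get-SPOMigrationJobStatus -TargetWebUrl $targetWeb -Credentials $credSP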

Going Forward

The script could be called as a function, with the variables passed as parameters, to allow uploading from multiple locations to multiple site libraries as a batch (see the sketch below).
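
A minimal sketch of that idea, assuming the credential and Azure Storage variables above are already set (the function name, folder layout and batch.csv columns are invented for illustration):

function Start-ShareMigration {
    param(
        [Parameter(Mandatory)][string]$SrcPath,
        [Parameter(Mandatory)][string]$TargetWeb,
        [Parameter(Mandatory)][string]$TargetLibrary,
        [string]$WorkDir = "C:\temp"
    )
    # per-run package folders so batched runs don't overwrite each other
    $stamp = Get-Date -Format yyyyMMddHHmmss
    $pkgPath   = Join-Path $WorkDir "SourcePkg_$stamp"
    $targetPkg = Join-Path $WorkDir "TargetPkg_$stamp"
    New-SPOMigrationPackage -SourceFilesPath $SrcPath -OutputPackagePath $pkgPath -NoAzureADLookup
    ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath $SrcPath -SourcePackagePath $pkgPath -OutputPackagePath $targetPkg -TargetWebUrl $TargetWeb -TargetDocumentLibraryPath $TargetLibrary -Credentials $credSP
    $azureSource = Set-SPOMigrationPackageAzureSource -SourceFilesPath $SrcPath -SourcePackagePath $targetPkg -AccountName $azureStorageAccountName -AccountKey $azureStorageAccountKey -AzureQueueName "fs2sp"
    Submit-SPOMigrationJob -TargetWebUrl $TargetWeb -MigrationPackageAzureLocations $azureSource -Credentials $credSP
}

# hypothetical batch.csv with header: SrcPath,TargetWeb,TargetLibrary
Import-Csv "C:\temp\batch.csv" | ForEach-Object {
    Start-ShareMigration -SrcPath $_.SrcPath -TargetWeb $_.TargetWeb -TargetLibrary $_.TargetLibrary
}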

Error capturing and logging could also be expanded and implemented.

Wednesday, 11 May 2016

Set Up Domain Name which uses a Dynamic IP


The Problem

I have finally broken down and purchased a couple of domain names (to cover <mydomain>.com and <mydomain>.ca) - they were quite simply too cheap not to buy. The problem is that I have residential high-speed service, which uses dynamic IP assignment to my modem, and even though I am only using the DNS/IP for development and testing at this point, I need the name to resolve to a web server on a Hyper-V virtual machine and to update the IP in my nameserver records whenever it changes. A static record pointing to a dynamic IP which resolves internally to a dynamic IIS site on a dynamic Hyper-V server. Wee!

The Approach

The first step is to find a free or cheap way to automate my A records. A bit of Googling turned up a couple of options. I found DynDNS to be pretty expensive for my needs (it cost more than the price of the domains at the time of writing ...) so I opted for EntryDNS. The one-time donation / fee is very appealing, and the few reviews I checked out gave the site high praise. The whole point of this service is to manage A records and to allow dynamic update of IPs using basic URL requests. So I signed up.

Next I went to the registrar of my domains and dug through the documentation until I found how to assign third-party name servers to my record and changed them to ns1.entrydns.net and ns2.entrydns.net ... then waited a full day for this change to propagate (I guess there was a reason for the sale ...)

In the meantime I set up my modem, router and server (Hyper-V host running Windows 10 Pro) to port forward requests on ports 80 through 443:
Modem - dynamic external IP, internal dynamic (192.168.x.x) IP assignment. Set up port forwarding of all incoming requests on ports 80 through 443 to the internal IP of my router (192.168.0.1).
Router - dynamic external IP (192.168.0.1 preferred), internal static IP range 10.0.0.x/24. I set up port forwarding (192.168.0.1:80) to my Hyper-V server (10.0.0.4:80).

As the server runs Hyper-V on Windows 10 Pro, I decided to use a NAT on that server to manage Hyper-V development environments - a simple solution found by reading https://4sysops.com/archives/native-nat-in-windows-10-hyper-v-using-a-nat-virtual-switch/ and then running the following PowerShell on the Windows 10 server (as Admin):

$serverExternalIP = "10.0.0.4"
# LAN IP of the Windows 10 host - the external side of the NAT

$serverInternalIP = "10.0.99.1"
# the 10.0.99.x part must match the $IPRange below and the .x part must be 1

$IPRange = "10.0.99.0/24"
# NAT subnet to be used with this connection

$DestinationHyperVServer = "10.0.99.225"
# this is the IP of the Hyper-V server with the IIS service running (and SharePoint in my case)

# Run Get-NetNat first to see if you already have a defined NAT running
New-VMSwitch -Name "NAT" -SwitchType NAT -NATSubnetAddress $IPRange
# change the IP address of the host's new vEthernet adapter to the first IP in the range
New-NetIPAddress -IPAddress $serverInternalIP -PrefixLength 24 -InterfaceAlias "vEthernet (NAT)"
New-NetNat -Name NAT -InternalIPInterfaceAddressPrefix $IPRange

Add-NetNatStaticMapping -NatName "NAT" -Protocol TCP -ExternalIPAddress 0.0.0.0 -InternalIPAddress $DestinationHyperVServer -InternalPort 80 -ExternalPort 80
Add-NetNatExternalAddress -NatName "NAT" -IPAddress $serverExternalIP -PortStart 80 -PortEnd 443
Get-NetNatExternalAddress
Get-NetNatStaticMapping
Get-NetNat

Next I set up a basic Hyper-V farm consisting of 3 servers (using the Windows Server 2012 R2 developer trial) on the NAT virtual switch created by the PowerShell above, as follows:

AD (DNS and Active Directory Domain Services for my two purchased domains - all settings in PowerShell to enable quick building): IP set internally to 10.0.99.2, subnet 255.255.255.0 and gateway set to the host system's internal NAT IP (10.0.99.1 in this case).
CLS set to use 10.0.99.3 in DNS.

IPv4 DNS server settings point to Google (8.8.8.8 and 8.8.4.4), with the DNS suffix of my domain appended in advanced DNS settings (not sure if this is necessary, but it doesn't hurt, so ...)

Then I joined the server to the newly created Domain.

I left the window open to allow adding records when other servers join the domain.

IIS (IIS/SharePoint development server)
Pretty much the same IP settings, except the IP address was set to 10.0.99.225 and the alternate DNS server IP set to 10.0.99.2 (so the internal DNS service resolves internal names). IIS was initially set up with just HTTP and no bindings.

SQL (SQL Server 2012): IP in the unused 10.0.99.x range; otherwise the same settings as the IIS server.

Testing is pretty simple - browse to localhost and 10.0.99.225 from the IIS machine: both should produce a site. Browse to 10.0.99.225 from SQL and AD: you should see the same site. From the Hyper-V host, browse to 10.0.99.225, 10.0.0.4 (or whatever its internal IP address is) and whatever your external IP is (from the modem): all three should open the same site.
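
A quick way to sanity-check the port mapping without a browser is Test-NetConnection from the host (same IPs as above; this is only a reachability probe, not a test of the site itself):

# confirm something is answering on port 80 at the NAT'd guest and at the host's LAN address
Test-NetConnection -ComputerName 10.0.99.225 -Port 80
Test-NetConnection -ComputerName 10.0.0.4 -Port 80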

Once the nameservers have refreshed and your whois record shows them as being used, test the domain name at both the server and host levels.

For SharePoint I set up Alternate Access Mappings (AAM) for each of the subsites to go to different applications at the same IP; this worked well, but I will not go into detail here.

Now came the fun part. I created the following PowerShell script to handle updating the dynamic IP recorded in the EntryDNS service. It uses the domain record tokens in a string array - I'll likely change this to read an input file of CSV or XML delimited values.

The overall logic is as follows:
Get the current public internet IP (from checkip.dyndns.com), compare it to the last recorded IP address, and if different, update the EntryDNS entry and the IP record. Then set up a timed job to run the script on a regular interval.

The code:
#DNS codes for each domain entered in EntryDNS
    $Codes = ("put the codes here")
#a simple text file holding a single IP address
#change to any location you like for a temp file (C:\temp or something)
    $IpFile = "C:\IP.txt"

    # set to $true to debug the settings. Logging is very verbose, so not a good idea to leave on.
    $logging = $false
    $logpath = "C:\testlogs\"

function UpdateEntryDNS ($Code, $currentIP) {
    try {
        $req = "https://entrydns.net/records/modify/$Code"+"?ip="+"$currentIP"
        # Builds the url to update the IP address
        $r = Invoke-WebRequest $req
        if ($logging -eq $true) {
            $dt = Get-Date -Format g | foreach {$_ -replace ":", "."}
            $fn = "$logpath"+$dt+$Code+".txt"

            # if the path doesn't exist, create the directory
            if ((Test-Path($logpath)) -eq $false) {mkdir $logpath}

            # output file and populate with details
            # interval of checking should be at least 1 min apart or files get overwritten
            $Logoutput =  "$dt "+ $r.StatusCode +" " +$r.Content+" "+$req  | Out-File $fn -Force
        }
        Write-Host "Updated"
        }
        catch {
    
        }
}    
function Get-ExternalIP {
# the following parses the returned page
# changing the source will mean parsing the returned content appropriately.
    $Ip =(Invoke-WebRequest "checkip.dyndns.com").Content
    $Ip2 = $Ip.ToString() 
    $ip3 = $Ip2.Split(" ") 
    $ip4 = $ip3[5] 
    $ip5 = $ip4.replace("</body>","") 
    $curIP1 = $ip5.replace("</html>","")
    $curIP = ($curIP1.replace("`n","")).Trim()
    return $curIP
}
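
Since any change to the checkip page layout will break the positional split above, a regex is a more forgiving way to pull the address out. A minimal sketch (the function name is mine, not part of the original script):

function Get-ExternalIPRegex {
    # match the first dotted-quad anywhere in the returned markup
    $html = (Invoke-WebRequest "http://checkip.dyndns.com").Content
    if ($html -match '(\d{1,3}\.){3}\d{1,3}') { return $Matches[0] }
}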


$currentIP = Get-ExternalIP

# if the file exists, read it - otherwise start from a placeholder IP so the first run updates and creates the file
if (Test-Path($IpFile)) {
    $content = ([IO.File]::ReadAllText($IpFile)).Trim()
}
else {
    $content = "0.0.0.0"
}
# New IP
if ([IPAddress]$content -ne [IPAddress]$currentIP) {
    Write-Host "Updating ... "
    Write-Host "Old IP: " $content
    Write-Host "Current IP: " $currentIP
    # (over)write the new IP address to the file
    $currentIP | Out-File $IpFile -Force

    # update dns for all in list
    ForEach ($Code in $Codes) {UpdateEntryDNS $Code $currentIP}
}
# No New IP
else {
    Write-Host "Equal"
}

Running the above will create the file C:\IP.txt, which contains the most recent IP address. If logging is changed to $true, a folder is created at C:\testlogs\ with a timestamp+code named file for each update attempted - inside each is the return code and content message (200 and OK are the desired results here) as well as the combined URL, for debugging purposes.

Now that the script has been created and tested, save it to a good location for custom scripts on the Windows 10 host system (C:\powershellscripts for instance) and set up a timed event to run it.

Open "Administrative Tools' and 'Task Scheduler' and click 'Create Task'


Give it a meaningful name

Create a trigger and set up the timing as suits your environment (no more often than once every 5 minutes is recommended), and remember to enable 'Stop the task if it runs longer than ...' to eventually kill any hanging processes. Click OK.


Next add the action: Actions tab -> New -> select 'Start a program' -> set the program to powershell.exe with the path to the script above as the -File argument, and click OK.

Click OK and OK and close all those nasty windows!
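
Alternatively, the task can be created from PowerShell itself. A minimal sketch, assuming the script was saved as C:\powershellscripts\UpdateEntryDNS.ps1 (the file name, task name and interval here are mine):

# build the action and a repeating trigger (every 5 minutes, indefinitely)
$action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\powershellscripts\UpdateEntryDNS.ps1"
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 5)
Register-ScheduledTask -TaskName "Update EntryDNS IP" -Action $action -Trigger $trigger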

Voila! Minimal downtime when the IP changes.

Note that one still needs to add entries for SharePoint subsites in AAM, in DNS (EntryDNS, which adds more update codes to the list) and, to some degree, in IIS. Ideally a site creation script used in this environment would do all of that in one step.

The Result

A relatively simple PowerShell script which can be rerun on a timed basis to keep my IP up to date with my server.

Going Forward

Potentially use a CSV / text file to manage multiple domains, sub-sites and keys, and update the A record of each (in the absence of a way to use a wildcard) - see the sketch below.
If other free-ish services are available, add routines for each so a simple toggle can be selected in the script for parsing both the IP resolution and the EntryDNS site.
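
A minimal sketch of the CSV idea, reusing the UpdateEntryDNS function and $currentIP from the script above (the file name and column layout are invented for illustration):

# hypothetical codes.csv with header: Domain,Code
$entries = Import-Csv "C:\powershellscripts\codes.csv"
foreach ($entry in $entries) {
    UpdateEntryDNS $entry.Code $currentIP
}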

If this helped, let me know! If it is broken, Let me know!

Tuesday, 5 April 2016

PowerShell SharePoint Backup and Restore made automatic


The Problem

I need an automatic method of backing up and restoring my SharePoint 2013 farm (on-prem) to the same or a different farm (for creating a UAT/DEV environment refresh, for instance ...)

The Approach

Use PowerShell to create a fast backup and create a fast restore.

The Result

First create a folder on one of the SharePoint servers (C:\SP_Backups in this example).

Next, the current logged-in user needs the securityadmin fixed server role on the SQL Server instance, the db_owner fixed database role on all databases that are to be updated / accessed, and membership in the Administrators group on the server on which you are running the Windows PowerShell cmdlets ... in other words, membership in SPShellAdministrators. So if this fails with some kind of access or privilege errors, have the farm admin add the user running the script using:
Add-SPShellAdmin -UserName $domain\$user
When done, the same admin can remove the user using:
Remove-SPShellAdmin -UserName $domain\$user

Save the following to a .ps1 file in the newly created folder above, edit the path variables (noted in the comments below), and run it from an admin PowerShell window (using .\<filename>.ps1):

# REQUIRES that the account running this has:
# - securityadmin fixed server role on the SQL Server instance
# - db_owner fixed database role on all databases that are to be updated
# - membership in the Administrators group on the server on which you are running the Windows PowerShell cmdlets
# An administrator can grant permission to use SharePoint 2013 cmdlets with:
# Add-SPShellAdmin -UserName $domain\$user
#
# load the SharePoint snap-in if not already loaded
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}
# get some variables
$stamp = Get-Date -Format yyyy-MM-dd
# network share to copy completed backup to
$serverdir = "\\SERVER\Shared\baersland-farm-backup"
# backup directory on the SharePoint server (the C:\SP_Backups folder above, shared as \\SP13\SP_Backups)
$dir = "\\SP13\SP_Backups\$stamp"
# create a new subfolder based on year-month-day
New-Item $dir -ItemType directory -Force

$ctdb = Get-SPContentDatabase
# create the restore PowerShell script in the newly created folder
Out-File -FilePath $dir\1-run_me_to_restore.ps1 -Force -InputObject "# Restore Script created:  $stamp"
Out-File -FilePath $dir\1-run_me_to_restore.ps1 -Append -InputObject "# Optional for moving to another farm: -FarmCredentials domain\user -NewDatabaseServer newdbserver"
foreach ($db in $ctdb) {
    $name = $db.Name
    Write-Host "processing ... $name"
    Backup-SPFarm -Directory $dir -BackupMethod Full -Item $name -Verbose
    Write-Host "backup of $name success!!"
    Out-File -FilePath $dir\1-run_me_to_restore.ps1 -Append -InputObject "Restore-SPFarm -Directory $dir -RestoreMethod Overwrite -Item $name -Verbose -Confirm:`$false"
}
# move from local to network share
Copy-Item -Path $dir -Destination $serverdir -Recurse -Force
Clear-Host
Write-Host "░░░░░░░░░░░░░░░░░░░░░█████████
░░███████░░░░░░░░░░███▒▒▒▒▒▒▒▒███
░░█▒▒▒▒▒█░░░░░░░███▒▒▒▒▒▒▒▒▒▒▒▒▒███
░░░█▒▒▒▒▒▒█░░░░██▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒██
░░░░█▒▒▒▒▒█░░░██▒▒▒▒▒██▒▒▒▒▒▒██▒▒▒▒▒███
░░░░░█▒▒▒█░░░█▒▒▒▒▒▒████▒▒▒▒████▒▒▒▒▒▒██
░░░█████████████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒██
░░░█▒▒▒▒▒▒▒▒▒▒▒▒█▒▒▒▒▒▒▒▒▒█▒▒▒▒▒▒▒▒▒▒▒██
░██▒▒▒▒▒▒▒▒▒▒▒▒▒█▒▒▒██▒▒▒▒▒▒▒▒▒▒██▒▒▒▒██
██▒▒▒███████████▒▒▒▒▒██▒▒▒▒▒▒▒▒██▒▒▒▒▒██
█▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒█▒▒▒▒▒▒████████▒▒▒▒▒▒▒██
██▒▒▒▒▒▒▒▒▒▒▒▒▒▒█▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒██
░█▒▒▒███████████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒██
░██▒▒▒▒▒▒▒▒▒▒████▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒▒█
░░████████████░░░█████████████████" -ForegroundColor Green

The last bit is just a fun way of knowing when the script and backup have worked.

Restoring is simple: open the date-stamped folder and run the script this script created (<path>/1-run_me_to_restore.ps1). If moving to another environment with different farm credentials and a different database server (recommended), edit the file by removing the comment and adding values for the required info:
# Optional for moving to another farm: -FarmCredentials domain\user -NewDatabaseServer newdbserver
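
For example, an edited restore line might look like this (the content database, domain, user and server names are placeholders):

Restore-SPFarm -Directory \\SP13\SP_Backups\2016-04-05 -RestoreMethod Overwrite -Item WSS_Content -FarmCredentials UATDOMAIN\spfarm -NewDatabaseServer UATSQL01 -Verbose -Confirm:$false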

Going Forward

Parameterize the script so it can be called as part of a toolset.
Set up a script to add this as a scheduled task on one of the servers.
Add error handling (try/catch with a nasty catch ASCII image!) - a starting sketch below.
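
A minimal sketch of that error handling, wrapping the backup call inside the loop above (the sad-face ASCII art is left as an exercise):

try {
    Backup-SPFarm -Directory $dir -BackupMethod Full -Item $name -Verbose -ErrorAction Stop
    Write-Host "backup of $name success!!"
}
catch {
    Write-Host "backup of $name FAILED: $_" -ForegroundColor Red
}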