
Thursday, March 27, 2014

Configuring a Windows Azure SQL Sync Group


Richard Green, Azure Ninja, wrote a great guide on SQL in Azure.

In this guide, I'm going to walk you through the process of setting up Windows Azure SQL Sync between two SQL Azure databases. This Windows Azure SQL technology allows you to replicate databases either between other Azure SQL databases or with on-premises SQL Server databases.
Richard J Green works as a Senior Technical Consultant for Infront Consulting, specializing in delivering System Center solutions that help customers leverage their investment in IT with Microsoft technologies. Richard works extensively with Windows Azure and System Center.

You can follow more from Richard here
The operating system reported error 340: The supplied kernel information version is invalid.
SCCM Distribution Manager is not processing Applications.

So I had an issue with SCCM for the last few days whereby packages were failing to process.
I was getting an error that the SCCM services had no permission to the folder share.

When I was looking at the status messages it gave this super obscure message...



It claims that the SCCM server does not have permissions to the share.


Now I checked that the 2 servers had full access to the folder and share, so this just didn't make sense.



I went to the Distribution Manager log (distmgr.log) on the PS server and it told me this:


failed to create instance of IRdcLibrary

Well, that's the issue right there.

Remote Differential Compression (RDC) is either not installed or not functioning on the primary site server.  As far as I know, the PS server needs RDC to build the files for the content library on the DPs.  In my case the PS server does not have any DP on it, and I can only assume that someone removed RDC not knowing that it is the PS server that takes the content, uses RDC, and inserts the new package into the content library on a DP.
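If you want to check for RDC and put it back with PowerShell rather than clicking through Server Manager, something like this sketch should do it (the RDC feature name is what Server 2012 uses; on 2008 R2 you would swap in Add-WindowsFeature):

# Check whether Remote Differential Compression is installed on the site server
Import-Module ServerManager
Get-WindowsFeature -Name RDC

# If it shows as not installed, add it back
Install-WindowsFeature -Name RDC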

Anyway, I got RDC back on the server and all my packages started to process.


And my package status goes green.


Thursday, March 13, 2014

SCOM 2012 HA OPTIONS
SCOM 2012 WITH SERVER 2012 R2 & SQL 2012 SP1

As a full-time System Center consultant I am often asked about deploying products in the System Center suite in a multi-site global deployment.  The application that comes up at the top of the list is SCOM. Companies want to monitor data centers across the globe with SCOM deployments that are 100% highly available (HA).  Once we look at multi-site SCOM deployments we are naturally going to incur additional costs and complexity.  In this guide we will look at a diagram that lays out the components and shows the flow between them.  We will then examine each configuration, looking at the pros and cons.

Deploying SCOM with the different HA options

So in this section, we are going to walk through the different options for installing SCOM in a multi-site environment and what considerations you may need to take into account in designing and implementing your SQL Server or servers.  SCOM is a great product in the System Center suite to discuss because a lot of companies require multi-site monitoring and want an HA SCOM deployment whereby, if they lose a primary data center, monitoring fails over to a secondary data center or DR site.
We are going to start off with the most basic SCOM deployment and work our way up from there. If you are a seasoned SCOM pro please excuse the basic nature of this, but it will be helpful for others to understand what it’s all building on.
Just in case some of the common abbreviations are not familiar to you:
MG      Management Group, the security realm between the MS and the SQL DB
MS      Management Server, the server that has a writable connection to the SQL DB
DB       Database, the SCOM monitoring and reporting databases that are hosted by SQL
GW      Gateway, the SCOM role that is used in remote locations to forward data to an MS
MP      Management Pack, an XML document that holds discoveries, rules, monitors and reports
SQL Licensing
Licensing is always a complex issue with System Center, and it doesn't get easier with SQL. That being said, I have been told from several sources that the no-cost SQL Standard licensing also applies to clustered instances of SQL Standard that are only being used to house System Center DBs. It was also confirmed to the MVP group that you can deploy SharePoint whose only purpose is to house System Center dashboards, and there is no licensing requirement.


Firstly we need a management server and a SQL Server to host the SCOM DB. A server we want to monitor has a SCOM agent loaded on it and sends its monitoring data to the management server, and the management server in turn writes that data to databases on a SQL Server.  If we deploy SQL Standard and it is only running to support System Center, then there is no cost for the SQL license.



New SQL 2012 guide for System Center.

Some time ago I wrote a guide for deploying System Center onto SQL and what you may need to consider.
The guide came about from a heated discussion between myself and a DBA at a big company who was not happy with a SQL configuration I made.

When I looked at the data out there it was very difficult to get a clear picture of the full gamut of System Center products and how they would interact with each other.

The last guide was written before the R2 release of System Center 2012, and so a lot of the new SQL 2012 features were not supported.  I know it took a long time to get this guide together, but it's nearly 200 pages of content. The people involved with this guide were: Pete Zerger did Azure, Robert Hedblom did DPM, Matthew Long did the SQL MP chapter, Craig Taylor did the SQL VMM template section, and Richard Green and Craig worked with me on the general editing and on the cluster builds, etc.


You can get the guide here


Monday, May 20, 2013

.NET 3.5 and Server 2012

I have tried several ways to get the .NET Framework 3.5 installed on Server 2012, however the only way that works every time and is super fast is PowerShell.
On the Server 2012 media is a folder called sources\sxs, so from an elevated command window I run:

dism /online /enable-feature /feature:netfx3 /all /limitaccess /source:f:\sources\sxs

F:\sources\sxs is just wherever you have the SXS folder located.

Now I was working on a project with installs around the world and it was proving hard to get the media to all the different sites. I was wondering if copying just the .NET 3.5 cabs from that SXS folder would do the job, as it's only 18 MB, but alas it didn't work and I had to go back to the full folder.  What I now do is load the SXS folder onto a network share and just use the network path in a PowerShell script.
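As a rough sketch of what that script does (the \\fileserver\media\sources\sxs path is just a placeholder for wherever you host the SXS folder):

# Install .NET Framework 3.5 from a network copy of the SXS folder
Import-Module ServerManager
Install-WindowsFeature -Name NET-Framework-Core -Source \\fileserver\media\sources\sxs

# Or the equivalent DISM one-liner pointing at the same share
dism /online /enable-feature /feature:netfx3 /all /limitaccess /source:\\fileserver\media\sources\sxs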


SQL Permissions Error on setup (The SQL service account login and password is not valid)

I was installing several SQL instances for a SCCM pre-prod environment that was going to mimic a production build but at a much smaller scale.  I was using a specific AD account and it had to span 3 domains in the same forest.  In the first domain (where the account existed) I had no issues. Once I went to the other 2 domains I kept getting the same 2 errors, namely

"The SQL Server service account login or password is not valid"
Or
"The credentials you provided for the Reporting Service are invalid"

I make as many mistakes as the next person, so I wrote down the username and password in Notepad and then opened PowerShell as that user (so I definitely knew I was using the correct username and password).
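A quick way to do that check is to launch a new PowerShell window under those credentials; DOMAIN\svc-sql below is just a placeholder account:

# Prompts for the password and opens a new PowerShell window running as that account;
# if the credentials are wrong the launch fails immediately
Start-Process powershell.exe -Credential (Get-Credential DOMAIN\svc-sql)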


I am crap at reading past error messages but the second error caught my eye
So it is telling me here that if I have an issue with the domain account I can go and change that in SQL Server Configuration Manager.  So I can perform my install with the standard built-in accounts and then change them when the install is done.  But the really important thing here is that you can't go to services.msc and change the account that any of the SQL services run under, as this will not allow SQL to properly configure the services.  One of the things that SQL Server Configuration Manager does is grant the account the right to log on as a service. (That might have been my issue, but it still does not make sense when SQL Server Configuration Manager can make the changes.)

So jump into SQL Server Configuration Manager, select the SQL services, and change the user account from the built-in account to the domain account you want.
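If you prefer to script that change, a rough sketch using the SQL Server WMI provider (which is essentially what SQL Server Configuration Manager talks to) could look like this; the machine, instance and account names are placeholders:

# Load the SQL WMI management assembly and connect to the SQL box
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement")
$wmi = New-Object Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer "SQLSERVER01"

# Change the database engine service (default instance) to run as a domain account;
# unlike services.msc, this goes through the SQL WMI provider so the service gets configured properly
$svc = $wmi.Services["MSSQLSERVER"]
$svc.SetServiceAccount("DOMAIN\svc-sql", "P@ssw0rd")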


Now you can see that I have all the services changed to domain accounts, except for the Reporting Service.
To change the SSRS account you need to log on to the Reporting Services Configuration Manager; before we can change the account to a domain account we need to back up the encryption key.
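The key backup can be done from the Encryption Keys page in Reporting Services Configuration Manager, or from the command line with rskeymgmt; the file path and password here are just placeholders:

# Back up the SSRS encryption key to a file before changing the service account
rskeymgmt -e -f C:\Backup\ssrs-key.snk -p MyStr0ngP@ss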
Now when I look at the SSRS page there is a domain account and not a built-in account.


And the same can be seen on the SQL Server Configuration Manager page.


There are 2 last points here: if you are installing SCCM you must use network (domain) accounts, you cannot use the built-in SQL accounts; and if you are running these services under a domain account then you will need to register the SPN.
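Registering the SPNs for the SQL service account is a couple of setspn one-liners; the server name, port and account below are placeholders:

# Register the SQL Server SPNs against the domain account the service runs under
setspn -S MSSQLSvc/SQLSERVER01.domain.local:1433 DOMAIN\svc-sql
setspn -S MSSQLSvc/SQLSERVER01.domain.local DOMAIN\svc-sql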










To CAS or not to CAS....

I have been designing and deploying Configuration Manager for 12 years. In recent years I have been lucky enough to be working on some large-scale SCCM projects.  So what is large scale to me? Over 40,000 seats goes into the "large scale" category for me.  The CAS is nothing new to SCCM: in 2007 we had the idea of the central admin site, and in many ways it acted in the same way as the CAS. The CAS, or Central Administration Site, is basically a central admin location where you can manage multiple primary sites in one console.

When SCCM 2012 was released the CAS had to be installed during the initial install phase, and as a result a lot of people deployed a CAS as a just-in-case design.  The CAS then received a lot of negative press because it introduced a layer of complexity that many didn't understand.  I am going to cover the practical steps of deploying a CAS in another blog, but in this blog we are just going to look at some of the design decisions you may go through when choosing to deploy a CAS.

A single Primary Site (PS) can support up to 100k clients. I have never deployed an SCCM environment with 100k seats but I have installed a CAS 3 times, so was I wrong to do that?  Let me give you a profile of one install.  The customer was a retail customer with a total of about 80k seats.  The client was expecting about 5% growth over the next 3 years, and they have a support goal of never going past 80% of the Microsoft supportable limit, so we needed a CAS to support more than 1 PS.

My next install was for 45k seats, which is well within the 100k limit, with a growth rate of about 15% over the next 3 years; with this deployment we still went for a CAS and 3 PS servers.  So if the CAS brings more complexity and more databases and servers, then why choose a CAS?

It came down to the following points:

If I have a single PS server supporting, say, 50k clients and it goes down, then we can add no new software, clients or OS images.  No new policies can be sent to the clients, and this means it's a pretty big single point of failure for a large org.

In this example we have 1 CAS and 3 PS servers; the 3 PS servers were in the US, EU and APAC.  With this design we could lose the CAS and 2 PS servers and still have the last PS working with no issues.

There is no doubt that with the changes in SCCM 2012 SP1, and the ability to add a CAS after installing the first PS, it is going to be more likely that smaller sites will not go for a CAS to begin with unless they really need one, and who should decide that... well, you.

In SP1 there are also some good changes to how we can view what is happening from a site replication standpoint from within the monitoring component.