PowerShell on the River 2019 #PSOTR

Lunch time crowd taking a break

On August 9th and 10th I had the wonderful privilege of attending and speaking at the 2nd annual PowerShell conference in Chattanooga, TN, held again at the great venue of Chattanooga State Community College. The event was put on by @PSJamesP, @LittlejohnPSH, and a group of folks who knocked it out of the park again this year!

On Friday I attended the all-day Advanced PowerShell Scripting Workshop, given by none other than @JeffHicks. I learned even more about best practices for building new scripts and have included my notes in my new GitHub repo here.

Saturday was a busy day of short 45-minute talks given by a great group of speakers:

@Schlauge – Search Event Logs Like a Boss

@JeffHicks – An Intro to Desired State Configuration

@steviecoaster – Choco for Beginners

@techTrainerTim – Git and Github: Newbie to Intermediate contributor in 45 minutes

@nocentino – Containers you better get on board

Lastly, I gave my own talk, an introduction to SharePoint Online PowerShell.
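For a taste of the kind of commands the intro covered, here is a minimal sketch using the SharePoint Online Management Shell; the contoso tenant name is a placeholder:

  # A minimal sketch, assuming the SharePoint Online Management Shell module
  # is installed; 'contoso' is a placeholder tenant name.
  Install-Module -Name Microsoft.Online.SharePoint.PowerShell -Scope CurrentUser
  Connect-SPOService -Url 'https://contoso-admin.sharepoint.com'

  # List every site collection and its current storage use.
  Get-SPOSite | Select-Object Url, StorageUsageCurrent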

Links to the repo for the entire event can be found here.

All my notes for my session can be found here.

Once again, thanks to everyone who spoke and put on this event. Everyone did a great job, and I’m already looking forward to next year’s event!


PowerShell on the River coming up, August 9-10!

This is a quick reminder that there’s a great local event coming up on August 9th and 10th in Chattanooga, TN, if you are free for two full days of learning about PowerShell and related topics.

Check out the link here.

Don’t forget to sign up soon, and see you there!

SQL Saturday Chattanooga 2019

Back on June 21st and 22nd, I was able to attend a great event held at Chattanooga State Community College.

As the homepage describes, the two-day event consisted of an all-day workshop on Friday followed by a series of short presentation sessions throughout the day on Saturday.

During the Friday workshop I attended a wonderful session given by Edwin Sarmiento on the SQL Server DBA’s Guide to Docker. I thoroughly enjoyed learning the concepts of Docker and how containers can be used to run instances of SQL Server on both Windows Server and Linux. The ability to run multiple instances of SQL Server in containers is really fascinating, and I can see how businesses would want to take advantage of this technology on Linux to avoid some of the usual costs of running SQL Server in an IT environment.
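To give a flavor of what that looks like, here is a hedged sketch of spinning up two SQL Server instances as Linux containers from PowerShell; the SA password, container names, and host ports are all placeholders:

  # Pull the official SQL Server on Linux image.
  docker pull mcr.microsoft.com/mssql/server:2017-latest

  # First instance, on the default port 1433.
  docker run -d --name sql1 -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=P@ssw0rd!2019' `
      -p 1433:1433 mcr.microsoft.com/mssql/server:2017-latest

  # Second instance on the same host, mapped to port 1434.
  docker run -d --name sql2 -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=P@ssw0rd!2019' `
      -p 1434:1433 mcr.microsoft.com/mssql/server:2017-latest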

The Saturday sessions were even better. From the listings here:

Leveraging Python with SQL Server

Using your on prem data in the cloud

How do you Azure?

Introduction to serverless architecture

Tips and Tricks for the PowerShell DBA

I thoroughly enjoyed the event and want to personally say thanks to everyone, especially the presenters and volunteers who helped put it on.

End-of-the-day giveaway!

SPTechCon Austin 2019

Back on February 10th-13th I had the privilege of attending the SPTechCon conference in Austin, Texas for the first time. Boy, what a great conference! Even though the weather that week was rainy and cold for Austin, the content and learning opportunities more than made up for it.

I arrived on Saturday the 9th to get ready for the Office 365 Kitchen event happening all day Sunday, which turned out to be the best part of the entire conference. Imagine an all-day session with at least 8 Microsoft MVPs who help you choose one of 5 real-life scenarios that could be solved using various tools in O365. You then work in groups with other attendees to come up with a solution to your specific problem and, at the end of the day, present what you created to the larger group.

My group worked specifically on how to create a Microsoft Teams help area, built in the product itself, that would not only on-board new users but also let existing users within an organization use resources surfaced inside Teams to learn how to use it better. I know I will be using a lot of these lessons to create a setup like this when I get back to the office.

O365 Kitchen

Monday was a great day of sessions, starting off with Nik Charlebois teaching how Desired State Configuration (DSC) in PowerShell can keep a consistent configuration on your SharePoint servers. This is definitely a great way to keep up with, and help prevent, unwanted changes in your SharePoint farm.
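As a hedged illustration of the idea (using only the built-in PSDesiredStateConfiguration resources rather than the SharePoint-specific ones from the session), a configuration declares the state you want and DSC keeps the node there:

  # A minimal sketch using built-in DSC resources; the node name is a placeholder.
  Configuration ServerBaseline {
      Import-DscResource -ModuleName PSDesiredStateConfiguration

      Node 'SPServer01' {
          WindowsFeature IIS {
              Ensure = 'Present'
              Name   = 'Web-Server'
          }
      }
  }

  # Compile to a MOF file, then apply; DSC re-checks the node and corrects drift.
  ServerBaseline -OutputPath C:\DSC
  Start-DscConfiguration -Path C:\DSC -Wait -Verbose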

The next talk of note was a good discussion of Investigation and Forensics in Office 365 by Liam Cleary, which showed off some great features and functionality for finding out what happened when a security incident occurs in Office 365.

After that, the afternoon brought a great PowerShell session from the dynamic duo of Todd Klindt and Shane Young. All the good stuff was here: what to do with passwords in scripts, using PSReadLine, and even the PnP module add-in that improves how PowerShell works with SharePoint.
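On the passwords-in-scripts point, one common pattern (a hedged sketch; the file path is a placeholder) is to cache a credential encrypted under your user profile instead of hard-coding it:

  # Run once, interactively; Export-Clixml stores the password encrypted with
  # DPAPI, so only this user on this machine can read it back.
  Get-Credential | Export-Clixml -Path "$HOME\sp-admin.cred.xml"

  # Later, inside a script:
  $cred = Import-Clixml -Path "$HOME\sp-admin.cred.xml"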

Tuesday started off with a great PowerApps and Flow discussion with Shane Young that went into some real depth on how the product has changed since I last kicked the tires on it. I can tell PowerApps is definitely more well-rounded than it was even 3 months ago, and improvements continue to roll out.

The next session was a great keynote speech on the future of tech by Rima Reyes. Rima is a great speaker and told a personal story of how she worked her way up the ranks from SharePoint admin for The White House to her current role working with Microsoft Teams. If you ever get the chance to hear it, don’t miss out.

My last session on Tuesday was another one by Liam Cleary, this time on the SharePoint PnP PowerShell module. Liam always gives a good overview and introduction to it, and he described some possibilities of the Patterns and Practices module that I had never really considered but now have to try out.
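For anyone who hasn’t seen it, here is a hedged taste of the module (cmdlet names from the classic SharePointPnPPowerShell module; the site URL and list name are placeholders):

  # Connect to a site collection and poke around.
  Connect-PnPOnline -Url 'https://contoso.sharepoint.com/sites/teamsite' -UseWebLogin

  # List the lists on the site, then create a new one.
  Get-PnPList
  New-PnPList -Title 'Project Tracker' -Template GenericList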

Wednesday, the final day, started off with a great discussion from Marc Anderson on using PowerApps to replace SharePoint list forms. I especially liked the demo that used the map and location tech on a user’s phone to feed info into a SharePoint list form, now that’s cool.

Next was a session on Power BI by Treb Gatte, who gave some great insights on pulling SharePoint-related data into Power BI. Treb also gave a good explanation of the different levels of Power BI and how to tell them apart based on the kinds of reports or features you are trying to use. I’m going to have to research Power BI more.

The last two sessions of the day were both about bot technology and how to use it in the context of Azure and Office 365. One session was by Robert German and the second was by Matt Wade. Bots are something I definitely see becoming more and more important as companies try to automate and develop different processes in the future. Both speakers gave great examples and made me want to dig in and learn more going forward.

And that wraps up my time in Austin at SPTechCon 2019. I encourage anyone who is interested to check out the next edition later in the year, SPTechCon East in Boston, which runs August 25th-28th. You can learn more about it at https://www.sptechcon.com/east.

Compare 2 similar Excel files with the Inquire Add-In

Note: this was only tested in Excel 2019. Office 365 ProPlus has a similar feature called Compare Sheets.

The other day I needed to compare two Excel files containing information exported from SharePoint lists. They were very similar, but with about 200-ish rows in each list I didn’t know of a good way to compare them; sorting through by hand would take a while.

That is, until I ran across the compare functionality in Excel, which can be turned on by making the Inquire add-in visible.

After some searching I came across this:

https://support.office.com/en-us/article/turn-on-the-inquire-add-in-6bc668e2-f3c6-4729-8ce1-75ea20aa9d90
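As a side note, if you want to check from PowerShell whether the add-in is registered before clicking through the UI, this hedged sketch uses Excel’s COM automation object:

  # List Excel’s COM add-ins and whether each one is currently enabled.
  $excel = New-Object -ComObject Excel.Application
  $excel.COMAddIns | Select-Object Description, Connect
  $excel.Quit()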

First, I had to go into Excel’s Options:
After that, I had to go to Add-Ins and, from the Manage drop-down, select COM Add-ins and click Go…:
Then I was able to check the box next to Inquire to make the tab show up in Excel:
Once enabled, you should see the Inquire tab with the following options:
From here you can select which files to compare:
After clicking the Compare button, you may be prompted to save both files, and once it runs, your output should look something like this:

In my case I was comparing URLs, so each time I clicked a URL in the left-hand pane, if it existed in the other worksheet it would be highlighted on the right-hand side.

Very handy if you have to compare similar info from Excel sheets. It saved me some time, and I hope it does the same for you.

Enjoy! -BJ

Year in Review 2018

Reflecting back, the previous year was again one of big changes for me. From a professional standpoint, I changed jobs mid-year in 2018 and took on a new role working from home. This has been a huge change that has taken some time to get comfortable with, and I have a great new appreciation for the folks who work remotely, especially those who have done it for an extended period.

From a travelling standpoint, I didn’t do much other than flying a couple of times to the Midwest for the new company I work for, so the new job and new schedule had a lot to do with my lack of travel this year.

On the community side of things, I was able to attend PowerShell Saturday Chattanooga for the first time. You can read more about that here.

In May I also attended the Nashville events Cloud Friday and SharePoint Saturday Nashville. Both were excellently done, and some of the best conference food I have ever had was there. A big shout-out to @JimBobHoward for putting on a couple of great events! I really enjoyed, and learned a lot from, both.

If you have a chance to attend either one of these events, I highly recommend it.

This year was also the first time I went through a Microsoft 365 migration. I am still learning a lot about 365, and what really blows me away is how many services are involved in the total offering.

Trying to keep up with the latest updates across this many services is already a struggle, but at the same time the pace of change and how often the service is updated is exciting.

Finally, this year I dipped my toe into the GitHub waters. I created a repository online where I will gradually start posting my awfully written scripts and other helpful info as well.

Keep an eye out there, and I will try to mention new content on Twitter and elsewhere as I add it.

Thanks again to everyone for helping grow my knowledge in the community, and I hope my posts here have helped increase others’ knowledge as well.

Hope everyone has a happy and healthy 2019!

-BJ

Book Review – The Phoenix Project

I recently finished this book, which I started when I began a new job back in June, and it has had a big impact on my thought processes since then.

Before I delve much further into the book itself: I think it should be required reading for anyone getting into the IT field. It gives you some true-to-life (at least for me) examples and scenarios of what to expect and, in some cases, how to overcome them. I cannot recommend this book enough just for its examples of how projects are run and maintained in IT and in many businesses today.

For me personally, I identified with many of the characters in the story, having worked in IT in some form or fashion since 2006. Some days are better, when you are leading a project to completion and getting high fives all around the office. Then there are days when it’s 3 a.m. and you don’t know when you or the folks you are working with will get to sleep again. It happens.

I think in order to get the most value out of this book though, the reader should be familiar with a business that makes a physical product from start to finish. Many of the comparisons this story makes are trying to bridge making widgets to making code or other resources that may not be as tangible.

I definitely get it now, though I didn’t at first. A veteran IT worker will pick up on the story faster than a newcomer, but I think a new IT worker will still get good advice from it even without understanding everything in it.

One other helpful aspect of this book is the insight it gives into the upper levels of a company. Many people like me spend their days chasing down problems or working on a shiny new feature or software program, but how often do we know what decisions had to be made to get us to where we are now?

One aspect I really enjoyed in the story is the battles and other things going on behind the scenes that few people get to see or discuss. It also made for great hero and villain development throughout the story.

The Phoenix Project also introduces the reader to the concepts of DevOps, showing how efforts to improve and streamline manufacturing processes can be translated to the IT world of today. No matter what kind of company you work for, I think these principles could be implemented in some form or fashion to help improve processes in IT.

It’s also one of those few books worth taking out once in a while to reread, to make sure you are still on the right path if you decide to follow its steps and implement DevOps in an organization.

Check it out over at the link here and pick up a copy. I think for me it has been well worth a re-read.

If you get a chance to read it yourself, let me know what you thought of it in the comments below and I look forward to seeing what others thought of it as well.

Thanks! -BJ

Why you should disable Nintex Automatic Database Mapping

There are a few things that, as soon as I learn them, I want to share immediately with as many people as possible. This is definitely one of those moments.

After starting a migration project at work involving some cleanup of Nintex Workflow content databases, I’ve run into something that everyone should check in their SharePoint environments right away.

It’s a Nintex setting under Central Administration, and the setting I want to warn everyone about is Automatic Database Mapping.

When you set up Nintex as a solution in SharePoint, the install is pretty straightforward, and it is not all that difficult to get up and going quickly.

One part of getting Nintex going after you install it is creating a config database and content database, which store the workflow history and the configuration of the service itself in SQL.

It’s at this point, right when you start, that I recommend you go in and turn OFF Automatic Database Mapping.

Why am I urging you to do this so intently?

Well, let’s just say I am now working on a migration in an environment that has been using Nintex for about 3 years, with a big config database (around 100 GB or so) and multiple Nintex content databases, all with Automatic Database Mapping enabled… oh, and in this farm there are about 30-40 site collections actively using Nintex.

My main goal with this migration is to consolidate the larger Nintex sites into their own content databases.

But with automatic mapping enabled, this is going to make my job a lot harder to sort out.

Now I hope you see why this is important to do when starting at the beginning of a setup. 🙂

Here are the steps to make this change in your SharePoint setup.

Open Central Admin, go to the Nintex section, and look for the Database Management section:

Once in this section, go down to the Manage area for databases at the bottom:

From here you should see the following option, where you can disable automatic mapping:


Once this is done and you click OK, you will next need to map your existing SharePoint content databases to your existing (or, even better, newly created and segregated) Nintex content databases.

Just make sure that as you do this there is no active work going on in the sites involved. It also wouldn’t be a bad idea to do an IISRESET to make sure the changed settings take effect right away.

Trust me, doing this when you start will save you a lot more time and effort than having to take a bunch of big existing databases and map them to newly created blank databases to logically separate the content out.

I hope this helps anyone starting out with a new Nintex install, or anyone who has to go down the cleanup road like I am.

Thanks! -BJ

Migrating Nintex Workflow Content to Another DB

Lately I’ve been looking into some cleanup of Nintex-related databases in a SharePoint farm, so I thought I would share a short write-up I did for a few other folks on the process of moving content between Nintex content databases.

Taken from: https://community.nintex.com/docs/DOC-1092

First, take backups of the Nintex and SharePoint content databases involved.

Go to Central Admin, and under the Nintex Workflow section pick the Databases option.

From here, scroll down and check your DB mappings so you know which DBs will be involved in these changes.

Next, go back to the Nintex Databases section and create a new blank Nintex DB to use for the migration (named e.g. Nintex_Content_SiteName).

If you are moving SharePoint content as well, go into Central Admin under Manage Content Databases and create a new one there, or use PowerShell to create the new SharePoint content DB and attach it to the farm, as sketched below.
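A hedged one-liner for the PowerShell route, with hypothetical database and web application names:

  # Create a new content database and attach it to the farm; names are placeholders.
  New-SPContentDatabase -Name 'WSS_Content_SiteName' `
      -WebApplication 'http://webapplication.domain.com'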

The SharePoint content can be migrated with the normal Backup-SPSite and Restore-SPSite method, specifying the newly created content DB so the SharePoint info lands in the new database.

Or this can be done using the Move-SPSite command in PowerShell, with an IISRESET as the last step. Both approaches are sketched below.
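Here is a hedged sketch of both approaches, with hypothetical URLs and database names:

  # Option 1: backup, delete, and restore into the new content database.
  Backup-SPSite -Identity 'http://webapplication.domain.com/sites/sitename' `
      -Path 'C:\Backups\sitename.bak'
  Remove-SPSite -Identity 'http://webapplication.domain.com/sites/sitename'
  Restore-SPSite -Identity 'http://webapplication.domain.com/sites/sitename' `
      -Path 'C:\Backups\sitename.bak' -ContentDatabase 'WSS_Content_SiteName'

  # Option 2: move the site collection directly between attached databases.
  Get-SPSite 'http://webapplication.domain.com/sites/sitename' |
      Move-SPSite -DestinationDatabase 'WSS_Content_SiteName'
  iisreset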

Now, to migrate the Nintex Workflow info, you will need a few more steps (a PowerShell sketch of the stop/move/restart portion follows the list):

  1. Stop the web application (in IIS) that contains the Nintex info being moved.
  2. Also stop the SharePoint Timer Service on ALL SharePoint servers in the farm so no actions take place in the background during the move.
  3. Run the nwadmin -o MoveData command to migrate the content to the new Nintex DB, for example: nwadmin -o MoveData -Url http://webapplication.domain.com/sites/sitename
  4. Once this command executes you may see errors or other info about the moved workflows. If there are failures, you may want to choose the option to roll back the changes.
  5. Once the command finishes successfully, restart the SharePoint Timer Service and the web application in IIS to get everything working correctly again.
  6. Recheck the database mappings in Central Admin to make sure the items are in the newly created database.
  7. You may also want to run NWAdmin.exe -o CleanTaskRedirects [-test], specifying the old Nintex DB, to check for any leftover workflows using Lazy Approval.
  8. If there are none, remove [-test] from the previous command and run it again to clear the remaining info out of the old Nintex DB.
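And here is that hedged PowerShell sketch of the stop/move/restart steps; the server names, IIS site name, and NWAdmin path are placeholders for whatever your farm uses:

  # Stop the IIS web application and the timer service on every farm server.
  Stop-WebSite -Name 'SharePoint - 80'            # placeholder IIS site name
  $servers = @('SPServer01', 'SPServer02')        # placeholder farm servers
  Invoke-Command -ComputerName $servers { Stop-Service -Name 'SPTimerV4' }

  # Move the Nintex Workflow data for one site collection (placeholder URL).
  & 'C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\BIN\NWAdmin.exe' `
      -o MoveData -Url 'http://webapplication.domain.com/sites/sitename'

  # Bring everything back up and reset IIS.
  Invoke-Command -ComputerName $servers { Start-Service -Name 'SPTimerV4' }
  Start-WebSite -Name 'SharePoint - 80'
  iisreset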

Again, check your mappings in Central Admin to make sure everything is now separated as it should be and you are good to go.

I hope this helps anyone who has to go through this process in the future. I’m still learning a lot about it myself, so once I’ve had plenty of practice, I may post an updated article as well.

Thanks everyone! -BJ