Compare 2 similar Excel files with the Inquire Add-In

Note: this was only tested in Excel 2019. Office 365 ProPlus has something called Compare Sheets that is similar to this feature.

I had the need the other day to compare two Excel files that contained information exported from SharePoint lists. They were very similar, but with about 200 rows in each list I didn’t know of a good way to compare them to each other; sorting through them by hand would take a while.

That is, until I ran across the compare functionality in Excel that can be turned on by making the Inquire add-in visible.

After some searching I came across this:

https://support.office.com/en-us/article/turn-on-the-inquire-add-in-6bc668e2-f3c6-4729-8ce1-75ea20aa9d90

  1. First, open Excel’s options (File > Options).
  2. Go to Add-Ins, and from the Manage drop-down select COM Add-ins and click Go…
  3. Check the box for Inquire to make the tab show up in Excel.
  4. Once enabled, you should see the new Inquire tab with its options.
  5. From here you can select which files to compare.
  6. After selecting the Compare button, you may be prompted to save both files, and once it runs the output shows the differences between the two workbooks side by side.

In my case I was comparing URLs, so each time I clicked a URL on the left-hand side, if it existed in the other worksheet it would be highlighted on the right-hand side.

Very handy if you have to compare similar info from Excel sheets. It saved me some time, and I hope it does the same for you.

Enjoy! -BJ


Year in Review 2018

Looking back, 2018 was again a year of big changes for me. From a professional standpoint, I changed jobs mid-year and also took on a new role working from home. This has been a huge change that has taken some time to get comfortable with, and I have a great new appreciation for the folks who work remotely and have been able to do it for an extended period.

From a traveling standpoint, I didn’t do much other than flying a couple of times to the Midwest for my new company. So again, the new job and new schedule had a lot to do with my lack of travel this year.

On the community side of things, I was able to attend PowerShell Saturday Chattanooga for the first time. If you want to know more about that, you can read about it here.

In May I also attended Cloud Friday and SharePoint Saturday Nashville. Both events were excellently done, with some of the best conference food I have ever had. A big shout out to @JimBobHoward for putting on a couple of great events! I really enjoyed them both and learned a lot from each as well!

If you have a chance to attend either one of these events, I highly recommend it.

This year I also went through my first Microsoft 365 migration. I am still learning a lot about Microsoft 365, and what really blows me away is how many services are involved in the total offering.

Trying to keep up with the latest updates across this many services is already a struggle, but at the same time the pace of change and how often the service is updated is exciting.

Finally, this year I dipped my toe into the GitHub waters. I created a repository where I will gradually start posting my awfully written scripts and other helpful info.

Keep an eye out there, and I will try to mention new additions on Twitter and other places as I add content to it.

Thanks again to everyone for helping grow my knowledge in the community, and I hope my posts here have helped increase others’ knowledge as well.

Hope everyone has a happy and healthy 2019!

-BJ

Book Review – The Phoenix Project

I recently finished this book, which I picked up after starting a new job back in June, and it has had a big impact on my thought processes since then.

Before I delve much further into the book itself, I think it should be required reading for anyone getting into the IT field. It gives you some true-to-life (at least for me) examples and scenarios of what to expect and, in some cases, how to overcome them. I cannot recommend this book enough, just for its examples of how projects are run and maintained in IT and in many businesses today.

For me personally, I definitely identified with many of the characters in the story, as I have been working in IT in some form or fashion since 2006. Some days are great, when you are leading a project to completion and getting high fives all around the office. Then there are days when it’s 3 a.m. and you don’t know when you or the folks you are working with will get to sleep again. It happens.

I think to get the most value out of this book, though, the reader should be familiar with a business that makes a physical product from start to finish. Many of the comparisons the story makes try to bridge the gap between making widgets and making code or other deliverables that may not be as tangible.

I definitely get it now, though I didn’t at first. A veteran IT worker will pick up on the story faster than a newcomer, but I think a new IT worker will still get good advice from it even if they don’t understand everything in it.

One other helpful aspect of this book is the insight it gives into the upper levels of a company. Many people like myself spend their days chasing down problems or working on a shiny new feature or software program. But how often do we know what decisions had to be made to get us to the point where we are now?

That is one aspect I really enjoyed in the story: the battles and other things going on behind the scenes that few people ever get to see or discuss. Plus, it was great for developing both the heroes and the villains throughout the story.

The Phoenix Project also introduces the reader to the concepts of DevOps, showing how efforts to improve and streamline manufacturing processes can be translated to the IT world of today. No matter what kind of company you work for, I think these principles could be implemented in some form or fashion to help improve IT processes.

It’s also one of those few books that is worth taking out once in a while to reread, to make sure you are still on the right path if you decide to follow its steps and try to implement DevOps in an organization.

Check it out over at the link here and pick up a copy. I think for me it has been well worth a re-read.

If you get a chance to read it yourself, let me know what you thought of it in the comments below and I look forward to seeing what others thought of it as well.

Thanks! -BJ

Why you should disable Nintex Automatic Database Mapping

There are a few things that after I learn them, make me want to immediately share with as many people as possible. This is definitely one of those moments.

After starting a migration project at work involving some cleanup of Nintex workflow content databases, I’ve run into something that everyone should check in their SharePoint environments right away.

It’s a setting under Central Administration for Nintex, and the one I want to warn everyone about is Automatic Database Mapping.

When you set up Nintex as a solution in SharePoint, the install is pretty straightforward, and it’s not all that difficult to get set up and going quickly.

One part of getting Nintex going after you install it is creating a configuration database and a content database in SQL to store the workflow history and the configuration of the service itself.

It’s at this point, when you start, that I recommend you go in and turn OFF automatic database mapping.

Why am I urging you to do this so intently?

Well, let’s just say I am working on a migration now in an environment that has been using Nintex for about three years, with a big config database (around 100 GB or so) and multiple Nintex content databases, all with automatic database mapping enabled… oh, and in this farm there are about 30-40 site collections actively using Nintex.

Now at this point my main goal with this migration is to consolidate the larger Nintex sites into their own content databases.

But with automatic mapping enabled, this is going to make my job a lot harder to sort out.

Now I hope you see why it is important to do this at the very beginning of a setup. 🙂

Here are the steps to make this change in your SharePoint setup.

Open Central Admin, go to the Nintex section, and look for the Database Management section.

Once in this section, scroll down to the Manage area for databases at the bottom.

From here you should see the option to disable automatic database mapping.

Once this is done and you click OK, you will next need to map your existing SharePoint content databases to your existing (or, even better, newly created and segregated) Nintex content databases.

Just make sure that as you do this, there is no active work going on in the sites involved. It also wouldn’t be a bad idea to do an IISRESET to make sure the settings you changed take effect right away.
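
To help with that mapping step, here is a minimal PowerShell sketch (run from a SharePoint Management Shell on a farm server) that lists your SharePoint content databases and their site counts so you know what you are pairing up with which Nintex content database, and then bounces IIS. The Nintex mapping itself still happens in Central Administration.

```powershell
# Load the SharePoint snap-in if this is a plain PowerShell window
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# List each content database, its web application, and how many site collections it holds
Get-SPWebApplication | ForEach-Object {
    Get-SPContentDatabase -WebApplication $_ |
        Select-Object Name, @{ n = 'WebApplication'; e = { $_.WebApplication.Url } }, CurrentSiteCount
}

# Bounce IIS so the new mappings take effect right away
iisreset
```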

Trust me, doing this when you start will save you a lot more time and effort than having to take a bunch of big existing databases and map them to newly created blank databases just to logically separate the content.

I hope this helps with anyone starting out with a new Nintex install or if you have to go down the cleanup road like I am as well.

Thanks! -BJ

Migrating Nintex Workflow Content to Another DB

Lately I’ve been looking into doing some cleanup of Nintex-related databases in a SharePoint farm, so I thought I would share a short write-up I did for a few other folks on the process of moving content when Nintex content databases are involved.

Taken from: https://community.nintex.com/docs/DOC-1092

First, take backups of the Nintex and SharePoint content databases involved.

Go to central admin under the Nintex workflow section and pick the databases option.

From here scroll down and check your DB mappings to make sure you know which DBs will be involved in these changes.

Next go back to the Nintex Databases section and create a new blank Nintex DB to use for the migration (named something like Nintex_Content_SiteName).

If you are moving SharePoint content as well, go into Central admin under Manage Content Databases and create a new one there, or you can use PowerShell to create the new SharePoint Content DB and attach it to the farm.

The SharePoint content can be migrated using the normal Backup-SPSite and Restore-SPSite method, specifying the newly created content DB so the SharePoint data goes to the new database.

Or this can be done using the Move-SPSite cmdlet in PowerShell, with an IISRESET as the last step. A rough sketch of both approaches is shown below.
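
Here is a sketch of both approaches from a SharePoint Management Shell, using placeholder values for the site URL, database name, and backup path (pick one option or the other, not both):

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Placeholder values for illustration only
$siteUrl   = "http://webapplication.domain.com/sites/sitename"
$newDbName = "WSS_Content_SiteName"
$backup    = "E:\Backups\sitename.bak"

# Option 1: back up the site collection and restore it into the new content database
Backup-SPSite -Identity $siteUrl -Path $backup
Restore-SPSite -Identity $siteUrl -Path $backup -ContentDatabase $newDbName -Force

# Option 2: move the site collection directly into the new content database
Move-SPSite -Identity $siteUrl -DestinationDatabase $newDbName

# Finish with an IIS reset so the change takes effect right away
iisreset
```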

Now to migrate the Nintex Workflow info you will need to do a few more steps:

  1. Stop the web app that contains the Nintex info being moved
  2. Also stop the SharePoint Timer Service on ALL SharePoint servers in the farm so no actions take place in the background during the move (see the sketch after this list for one way to do this across the farm).
  3. Run the NWAdmin.exe -o MoveData command to migrate the content to the new Nintex DB
    1. Example: nwadmin -o MoveData -Url http://webapplication.domain.com/sites/sitename
  4. Once this command executes you may see errors or other info about the moved workflows. If there are failures you may want to choose the option to roll back the changes.
  5. Once the command finishes successfully, restart the SharePoint Timer Service and the web application in IIS in order to get everything working correctly again.
  6. Recheck the database mappings in Central Admin to make sure the items are in the newly created database.
  7. You may also want to run NWAdmin.exe -o CleanTaskRedirects [-test]
  8. Specify the old Nintex DB to see if there are any leftover workflows for LazyApproval.
  9. If there are none, remove [-test] from the previous command and run it again to clean the leftover info out of the old Nintex DB.
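
For steps 2 and 3, here is a rough sketch of how the timer service stop/start and the MoveData call could be scripted across the farm. The server names come from Get-SPServer, PowerShell remoting is assumed to be enabled, and the NWAdmin.exe path is a guess that may differ depending on your SharePoint version and Nintex install:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# All SharePoint servers in the farm (the 'Invalid' role is the SQL server entry, so skip it)
$farmServers = Get-SPServer | Where-Object { $_.Role -ne 'Invalid' } | Select-Object -ExpandProperty Address

# Stop the SharePoint Timer Service (SPTimerV4) everywhere so nothing runs during the move
Invoke-Command -ComputerName $farmServers -ScriptBlock { Stop-Service -Name SPTimerV4 }

# Move the Nintex workflow data for the site collection to the newly mapped database
& "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\BIN\NWAdmin.exe" `
    -o MoveData -Url "http://webapplication.domain.com/sites/sitename"

# Once the move completes successfully, start the timer service back up
Invoke-Command -ComputerName $farmServers -ScriptBlock { Start-Service -Name SPTimerV4 }
```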

Again, check your mappings in Central Admin to make sure everything is now separated as it should be and good to go

I hope this helps anyone who has to go through this process in the future. I’m still learning a lot about it myself, so once I’ve had plenty of practice, I may post an updated article as well.

Thanks everyone! -BJ

PowerShell Saturday Chattanooga 2018

This weekend I attended my first ever PowerShell Saturday (and Pre-Conference) event put on by some great folks here in Chattanooga on August 10th and 11th.

The Friday pre-con event was given by none other than the famous Jeff Hicks (Blog | Twitter).

Loved all the scripts Jeff showed during his pre-con

Starting off the day Friday, Jeff showed us some great scripts that I will try to go back and post in my GitHub repository as well.

In the past I’ve always relied on one-liners and blog posts that linked to scripts in order to figure out a PowerShell solution, but Jeff did a great job of introducing a consistent layout and format for writing scripts in a well-thought-out way, so they can be easily read and digested by anyone who comes across them in the future.

I’ve really taken this to heart as a personal challenge and will try to do this myself going forward, since I’ve been looking into writing more scripts lately.

The entire day was packed with so much good material, and that was just the first day; the Saturday conference was still to come.


The Saturday conference started with a great talk about writing award-winning PowerShell from Mike F Robbins. He gave many tips on how to take your scripts to the next level, mainly around focusing on writing functions going forward rather than just scripts. His blog has tons of great, helpful tips and can be found here.

The next session was on taking control of profile scripts, by Tim Warner. Tim is a great guy and is very helpful through his online videos at Pluralsight and the many articles he has posted over at https://techtrainertim.com/.

He went into how you can use PowerShell profiles to accomplish different tasks and how to modify them to make your day-to-day work easier.
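
As a quick illustration of the starting point (my own example, not Tim’s demo), $PROFILE carries the paths of every profile script PowerShell will load, and creating your per-user, per-host profile is a one-liner:

```powershell
# Show all four profile script locations (note properties hanging off the $PROFILE string)
$PROFILE | Select-Object AllUsersAllHosts, AllUsersCurrentHost, CurrentUserAllHosts, CurrentUserCurrentHost | Format-List

# Create the per-user, per-host profile if it doesn't exist yet, then open it for editing
if (-not (Test-Path -Path $PROFILE)) {
    New-Item -ItemType File -Path $PROFILE -Force | Out-Null
}
notepad $PROFILE   # add the aliases, functions, and module imports you want in every session
```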

My next session was Troubleshooting with PowerShell by Jonathan Warnken. It was a great session, and he brought up some ideas I could use to create helpful scripts for my day-to-day work. You can find him on Twitter @MrBoDean.

The next session was back with Tim Warner again and was all about getting started with VS Code and using it for PowerShell development. Jeff Hicks covered this topic a bit on Friday, but Tim went into deeper detail and showed some cool ways to start getting your hands dirty when creating new scripts in VS Code.

My last session of the day was with Jeff again, peering under the covers with PowerShell and picking up more tips on producing better-quality scripts going forward. Some of these related to Write-Progress, default parameter values, and using verbose output in scripts to help track down errors, along with using Trace-Command to dig into what a command is really doing.
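
To give a flavor of a couple of those tips, here is a small example of my own (not code from the session) showing verbose output in an advanced function, a session-wide default parameter value, and a Trace-Command call to peek at parameter binding:

```powershell
function Get-DiskReport {
    [CmdletBinding()]
    param()
    Write-Verbose "Querying logical disks on $env:COMPUTERNAME"
    Get-CimInstance -ClassName Win32_LogicalDisk |
        Select-Object DeviceID, @{ n = 'FreeGB'; e = { [math]::Round($_.FreeSpace / 1GB, 2) } }
}

# Make verbose output the default for this function for the rest of the session
$PSDefaultParameterValues['Get-DiskReport:Verbose'] = $true

Get-DiskReport

# Watch how PowerShell binds parameters behind the scenes
Trace-Command -Name ParameterBinding -Expression { Get-DiskReport } -PSHost
```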

I have to say this was a great event, and I want to thank everyone involved with putting it on, most of all the PowerShell user group in Chattanooga, because without them we could not have had this event at all. If you want to know more and learn more, find them at https://twitter.com/chatpsug

Handling large lists with SharePoint on premises

If you have spent much time working with SharePoint, you know that once a group of users starts to like storing information in libraries or lists, you can certainly tell.

Ultimately what happens is you get an email (or call, or service ticket) one day that says “Hey, we are getting some kind of error that says we have exceeded the list view threshold…”. No problem, you think, I’ll just go into Central Admin and raise the threshold above 5,000 and that should fix it.

Well, it does fix the issue, but only until that threshold gets crossed again a week/month/year later.

And the story repeats over and over again until everything is slow and everyone hates SharePoint because it’s so slow… am I right???

What I’d like to recommend today is a method I have used many times before that solves this problem and hopefully teaches your users a more efficient way to store their content in SharePoint.

For this example, let’s say I have a list with 30,000+ items in it.

After looking at the content in this list, I see that we have about 5-6 years’ worth of information in it.

And that there is the kicker. The solution I want to recommend in this case is that you take your content from this list of too many items and break it into several smaller lists by year.

Now this structure is much more manageable, and it also allows you as an admin to go into Central Admin and lower that threshold to something that won’t cause SharePoint to move as slowly as the last day of work before you leave for vacation (anyone else experienced that?).

Probably the easiest way to accomplish this would be to use a third-party tool that your admin group might already have. There are bunches of them, and you can read more about them here:

https://collab365.community/sharepoint-comparison-matrix-for-3rd-party-migration-tools/

If you are more of a developer type and want to tackle the task using PowerShell, have at it, but your mileage may vary, especially when dealing with item-level permissions, notifications, and workflows (beware, and double-check for these). A rough sketch of the idea is shown below.
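
If you do go the PowerShell route, a minimal sketch using the server-side object model might look something like the following. The site URL and list names are hypothetical, the year-based target list is assumed to already exist with matching columns, and the QueryThrottleMode override only works if object model override is allowed on the web application. Treat it as a starting point, not a finished tool:

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Hypothetical site and list names, purely for illustration
$web        = Get-SPWeb "http://sharepoint.contoso.local/sites/teamsite"
$sourceList = $web.Lists["Big List"]
$targetList = $web.Lists["Big List 2017"]   # pre-created list with the same columns

# CAML query for items created during 2017, paged in small batches
$query = New-Object Microsoft.SharePoint.SPQuery
$query.Query = "<Where><And>" +
    "<Geq><FieldRef Name='Created'/><Value Type='DateTime'>2017-01-01T00:00:00Z</Value></Geq>" +
    "<Lt><FieldRef Name='Created'/><Value Type='DateTime'>2018-01-01T00:00:00Z</Value></Lt>" +
    "</And></Where>"
$query.RowLimit = 500
$query.QueryThrottleMode = [Microsoft.SharePoint.SPQueryThrottleOption]::Override

do {
    $items = $sourceList.GetItems($query)
    foreach ($item in $items) {
        $copy = $targetList.Items.Add()
        $copy["Title"] = $item["Title"]
        # ...copy whichever other columns matter to you here...
        $copy.Update()
    }
    # Move to the next page of results
    $query.ListItemCollectionPosition = $items.ListItemCollectionPosition
} while ($query.ListItemCollectionPosition -ne $null)

$web.Dispose()
```

Note that item-level permissions, notifications, and workflow associations are exactly the kinds of things a simple copy like this does not carry over, which is a big part of why the third-party tools exist.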

You can also set indexes on columns and use other strategies like metadata to help with scenarios like this, but those are topics that could be their own posts in the future.

So going forward, I beg you: don’t just increase that view threshold and leave it, or you will eventually have to come back and deal with the consequences. It brings to mind the old saying that an ounce of prevention is worth a pound of cure…

Thanks for reading and I hope this helps prevent any future pains in SharePoint!

AutoLab to create quick test labs

One of the best ways to learn how to do things in the field of technology is to actually get your hands dirty and practice. That’s a method I have followed for years, and it has helped me learn most of what I know today.

One way to do this is by creating virtual machines with software like Hyper-V or VMware to build and run things locally on your own machine. I’ve been doing this for a while, but I find it’s always a struggle to quickly spin up machines with certain things enabled and configured a certain way.

I’ve tried different methods in the past to help streamline things like this, but the other day I was made aware of a project on GitHub that really looks like it could be a great help for people who need things built a certain way quickly.

The project is called AutoLab and was created by Jason Helmick (blog | Twitter), a great trainer who has made some fantastic online courses through Pluralsight and other venues as well.

I’ve watched several of his PowerShell training videos in the past to help learn new things, but I think there is some real value here in AutoLab. It only works with Hyper-V, but it even helps you enable Hyper-V on a Windows box as part of the installation process.

The basis of the project is another GitHub project called Lability.

The project combines PowerShell Desired State Configuration (DSC) with configuration files and helper functions to quickly build up a lab, refresh a lab, take a snapshot of multiple VMs at once, and, when you are done, tear down the entire lab with one command.

HOW AWESOME IS THAT!!! Very awesome indeed!

Right now there are many configurations that people have contributed in order to set up different scenarios.

I think there’s great potential here for quickly building up test labs when you need to learn a particular skill or a subset of features in a new product.

If you get a chance, check it out, try it, and give feedback to the creators.

I’m going to do my part to help this take off because it’s a great idea!

Thanks and enjoy!

-BJ

 

Cloud Friday and SharePoint Saturday Nashville 2018

Last month I attended two events held in Nashville, TN: one on Friday called Cloud Friday Nashville and one on Saturday called SharePoint Saturday Nashville.

Both events were very well done and helped me gain some good knowledge on the changing topics of SharePoint and Azure.

It was by far some of the best conference food I have ever had. I would come back again next year just for that… 🙂

The Cloud Friday event delved into various topics related to Azure, and I got some good introductions as well as deep dives into a few areas I had not had the chance to review before.

Check them out online @cloudfridaynash and keep an eye out for the event next year!

SharePoint resources for demos

In today’s post I thought I would gather everything needed to create a working demo of SharePoint locally. Nowadays, Azure and other cloud providers make this much easier, but if you want more control you can spin up your own environment locally.

Windows Server

2019 (still beta as of now)

https://techcommunity.microsoft.com/t5/Windows-Server-Insiders/bd-p/WindowsServerInsiders

2016

https://www.microsoft.com/en-us/evalcenter/evaluate-windows-server-2016

2012R2

https://www.microsoft.com/en-us/evalcenter/evaluate-windows-server-2012-r2

 

SQL Server

2017

https://www.microsoft.com/en-us/evalcenter/evaluate-sql-server-2017-RTM

2016

https://www.microsoft.com/en-us/evalcenter/evaluate-sql-server-2016

2014

https://www.microsoft.com/en-us/evalcenter/evaluate-sql-server-2014-sp2

 

SharePoint 2016

https://www.microsoft.com/en-us/download/details.aspx?id=51493

 

SharePoint 2013

Foundation

https://www.microsoft.com/en-us/download/details.aspx?id=35488

Server (Standard and Enterprise)

https://www.microsoft.com/en-us/evalcenter/evaluate-sharepoint-server-2013

 

SharePoint 2010

Foundation

https://www.microsoft.com/en-us/download/details.aspx?id=24983

Server (Standard and Enterprise)

https://www.microsoft.com/en-us/download/details.aspx?id=16631

 

Enjoy and happy lab building!