Book Review – The Phoenix Project

I recently finished this book, which I read after starting a new job back in June, and it has had a big impact on my thought process ever since.

Before I delve much further into the book itself, I think it should be required reading for anyone getting into the IT field. It definitely gives you some true-to-life (at least for me) examples and scenarios of what to expect and, in some cases, how to overcome them. I cannot recommend this book enough just for its examples of how projects are run and maintained in IT and in many businesses today.

For me personally, I definitely identified with many of the characters in the story, as I have been working in IT in some form or fashion since 2006. Some days are great, when you are leading a project to completion and getting high fives all around the office. Then there are days when it's 3am and you don't know when you or the folks you are working with will get to sleep again. It happens.

I think in order to get the most value out of this book, though, the reader should be familiar with a business that makes a physical product from start to finish. Many of the comparisons the story makes try to bridge making widgets to making code and other deliverables that may not be as tangible.

I definitely get it now, though I didn't at first. A veteran IT worker will pick up on the story faster than a newcomer will, but I think a new IT worker will still get good advice from it, even without understanding everything in it.

One other helpful aspect of this book is the insight it gives into the upper levels of a company. Many people like me spend their days chasing down problems or working on a shiny new feature or software program. But how often do we know what decisions had to be made to get us to where we are now?

That is one aspect I really enjoyed in the story: the battles and everything else going on behind the scenes that most people rarely get to see or discuss. Plus, it made for great hero and villain development throughout the story.

The Phoenix Project also introduces the reader to concepts of DevOps in order to understand how efforts to improve and streamline manufacturing processes can be translated to the IT world of today. No matter what kind of company you work for, I think the principles for this could be implemented in some form or fashion to help improve processes in IT.

It's also one of those few books worth pulling out once in a while to reread, especially if you decide to follow its steps and try to implement DevOps in an organization, to make sure you are still on the right path.

Check it out over at the link here and pick up a copy. I think for me it has been well worth a re-read.

If you get a chance to read it yourself, let me know what you thought in the comments below; I look forward to hearing what others thought of it as well.

Thanks! -BJ


Why you should disable Nintex Automatic Database Mapping

There are a few things that, once I learn them, I want to share immediately with as many people as possible. This is definitely one of those moments.

After starting a migration project at work involving some cleanup of Nintex workflow content databases, I've run into something that everyone should check in their SharePoint environments right away.

It's a setting in Central Administration for Nintex, and the setting I want to warn everyone about is Automatic Database Mapping.

When you set up Nintex as a solution in SharePoint, the install is pretty straightforward, and it's not all that difficult to get set up and going quickly.

One part of getting Nintex going after you install it is creating a config database and a content database in SQL to store information about workflow history and the configuration of the service itself.

It's at this point, right when you start, that I recommend you go in and turn OFF automatic database mapping.

Why am I urging you to do this so intently?

Well, let's just say I am working on a migration now in an environment that has been using Nintex for about 3 years, with a big config database (around 100GB or so) and multiple Nintex content databases, all with automatic database mapping enabled… oh, and in this farm there are about 30-40 site collections actively using Nintex.

Now at this point my main goal with this migration is to consolidate the larger Nintex sites into their own content databases.

But with automatic mapping enabled, this is going to make my job a lot harder to sort out.

Now I hope you see why this is important to do when starting at the beginning of a setup. 🙂

Here are the steps to make this change in your SharePoint setup.

Open Central Admin, go to the Nintex section, and look for the Database Management section.

Once in this section, go to the Manage area of databases down at the bottom.

From there you should see the Automatic Database Mapping option; this is where you can disable it.

Once this is done and you click OK, you will next need to map your existing SharePoint content databases to your existing (or, even better, newly created and segregated) Nintex content databases.

Just make sure there is no active work going on for the sites involved while you do this. It also wouldn't be a bad idea to do an IISRESET to make sure the settings you changed take effect right away.
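If it helps to see exactly what you will be mapping, a quick bit of PowerShell from the SharePoint Management Shell can list every content database per web application, and you can run the IISRESET from the same window. This is just a convenience sketch on my part, not an official Nintex step:

    # Load the SharePoint snap-in if you are in a plain PowerShell window
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    # List every content database per web application so you know what needs a Nintex mapping
    Get-SPWebApplication | ForEach-Object {
        Write-Host "Web application: $($_.Url)"
        Get-SPContentDatabase -WebApplication $_ |
            Select-Object Name, CurrentSiteCount,
                @{ Name = 'SizeGB'; Expression = { [math]::Round($_.DiskSizeRequired / 1GB, 1) } } |
            Format-Table -AutoSize
    }

    # Make sure the mapping change takes effect right away
    iisreset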

Trust me, doing this when you start will save you a lot more time and effort than having to take a bunch of big existing databases and map them to newly created blank databases just to logically separate the content out.

I hope this helps with anyone starting out with a new Nintex install or if you have to go down the cleanup road like I am as well.

Thanks! -BJ

Migrating Nintex Workflow Content to Another DB

Here lately I've been looking into doing some cleanup of Nintex-related databases in a SharePoint farm, so I thought I would share a short write-up I did for a few other folks on the process of moving content when dealing with Nintex content databases.

Taken from: https://community.nintex.com/docs/DOC-1092

First, take backups of the Nintex and SharePoint content databases involved.

Go to Central Admin, under the Nintex Workflow section, and pick the Databases option.

From here, scroll down and check your database mappings to make sure you know which databases will be involved in these changes.

Next, go back to the Nintex Databases section and create a new blank Nintex database to use for the migration (named something like Nintex_Content_SiteName).

If you are moving SharePoint content as well, go into Central Admin under Manage Content Databases and create a new one there, or use PowerShell to create the new SharePoint content database and attach it to the farm.

The SharePoint content can be migrated by the normal Backup-SPSite and Restore-SPSite method, specifying the newly created content database so the SharePoint data ends up in the new database.

Or this can be done using the Move-SPSite command in PowerShell, with an IISRESET as the last step.
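For reference, here's roughly what those two options look like from the SharePoint Management Shell. The database name, web application URL, site URL, and backup path below are all placeholders, so adjust them for your farm and test on a copy first:

    # Create a new SharePoint content database attached to the web application
    New-SPContentDatabase -Name "WSS_Content_SiteName" -WebApplication "http://webapplication.domain.com"

    # Option 1: backup, remove, and restore the site collection into the new content database
    Backup-SPSite "http://webapplication.domain.com/sites/sitename" -Path "E:\Backups\sitename.bak"
    Remove-SPSite "http://webapplication.domain.com/sites/sitename" -Confirm:$false   # deletes the original, hence the backup first
    Restore-SPSite "http://webapplication.domain.com/sites/sitename" -Path "E:\Backups\sitename.bak" -ContentDatabase "WSS_Content_SiteName"

    # Option 2: let SharePoint move the site collection for you, then reset IIS
    Move-SPSite "http://webapplication.domain.com/sites/sitename" -DestinationDatabase "WSS_Content_SiteName"
    iisreset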

Now, to migrate the Nintex workflow info itself, you will need to do a few more steps (a consolidated PowerShell sketch follows the list):

  1. Stop the web application (in IIS) that contains the Nintex data being moved.
  2. Also stop the SharePoint Timer Service on ALL SharePoint servers in the farm so no actions take place in the background during the move.
  3. Run the NWAdmin -o MoveData command to migrate the content to the new Nintex database.
    1. Example: nwadmin -o moveData -Url http://webapplication.domain.com/sites/sitename
  4. Once this command executes you may see errors or other info about the moved workflows. If there are failures, you may want to choose the option to roll back the changes.
  5. Once the command finishes successfully, restart the SharePoint Timer Service and the web application in IIS to get everything working correctly again.
  6. Recheck the database mappings in Central Admin to make sure the items are in the newly created database.
  7. You may also want to run NWAdmin.exe -o CleanTaskRedirects [-test], specifying the old Nintex database, to see if there are any leftover workflows for lazy approvals.
  8. If there are none, remove [-test] from the previous command and it should clean the leftover info out of the old Nintex database.
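Putting the service and command-line pieces of those steps together, here's a rough sketch of what it can look like in PowerShell. The web application name, hive number (15 shown, for SharePoint 2013), and site URL are placeholders, and it assumes PowerShell remoting is enabled between your SharePoint servers:

    # 1-2. Stop the web application in IIS and the timer service on every SharePoint server
    Import-Module WebAdministration
    Stop-Website -Name "SharePoint - 80"

    $spServers = (Get-SPServer | Where-Object { $_.Role -ne 'Invalid' }).Address
    Invoke-Command -ComputerName $spServers -ScriptBlock { Stop-Service -Name SPTimerV4 }

    # 3. Move the Nintex workflow data for the site collection
    & "$env:CommonProgramFiles\Microsoft Shared\Web Server Extensions\15\BIN\NWAdmin.exe" -o MoveData -Url "http://webapplication.domain.com/sites/sitename"

    # 5. Bring everything back online once the move reports success
    Invoke-Command -ComputerName $spServers -ScriptBlock { Start-Service -Name SPTimerV4 }
    Start-Website -Name "SharePoint - 80"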

Again, check your mappings in Central Admin to make sure everything is now separated as it should be and that you are good to go.

I hope this helps anyone who has to go through this process in the future. I'm still learning a lot about it myself, so once I've had plenty of practice, I may post an updated article as well.

Thanks everyone! -BJ

PowerShell Saturday Chattanooga 2018

This weekend I attended my first ever PowerShell Saturday (and Pre-Conference) event put on by some great folks here in Chattanooga on August 10th and 11th.

The Friday pre-con event was given by none other than the famous Jeff Hicks (blog, Twitter).

Loved all the scripts Jeff showed during his pre-con

Starting off the day Friday, Jeff showed us some great scripts that I will try to go back and post to my GitHub area as well.

In the past I've relied on one-liners and other blogs that linked to scripts in order to figure out a PowerShell solution, but Jeff did a great job of introducing a consistent layout and format for writing scripts in a well-thought-out way, so they can be easily read and digested by anyone who comes across them in the future.

I've really taken this to heart as a personal challenge and will try doing this myself going forward, since I've been looking into writing more scripts lately.

The entire day was packed with so much good material, and that was just the first day; the Saturday conference was still to come.

Scraping the bleeding edge of tech here…lol

The Saturday conference started with a great talk on writing award-winning PowerShell from Mike F Robbins. He gave many tips on how to take your scripts to the next level, mainly around focusing on writing functions going forward rather than just scripts. His blog has tons of great, helpful tips and can be found here.

The next session was on taking control of profile scripts, by Tim Warner. Tim is a great guy and is very helpful through his online videos at Pluralsight and the many articles he has posted at https://techtrainertim.com/.

He went into how you can use profiles in PowerShell to accomplish different tasks and how to modify them to help.
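This isn't Tim's material, just the basic mechanics for anyone who hasn't poked at profiles before: $PROFILE points at your per-user, per-host profile script, which runs every time you start PowerShell, so it's a handy place for aliases, functions, and module imports.

    # Show the path PowerShell will look for your profile script
    $PROFILE

    # Create the file if it doesn't exist yet, then open it to start customizing
    if (-not (Test-Path -Path $PROFILE)) {
        New-Item -ItemType File -Path $PROFILE -Force | Out-Null
    }
    notepad $PROFILE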

My next session after this was Troubleshooting with PowerShell by Jonathan Warnken. He brought up some great ideas that I could use to create helpful scripts for my day-to-day work. You can find him on Twitter @MrBoDean.

The next session was back with Tim Warner again and was all about getting started with VS Code and using it for PowerShell development. Jeff Hicks covered this topic a bit on Friday, but Tim went into deeper detail and showed some cool things about getting your hands dirty when creating new scripts in VS Code.

My last session of the day was with Jeff again, peering under the covers with PowerShell and picking up more tips on producing better-quality scripts going forward. Some of these related to Write-Progress and default parameter values in scripts, along with using verbose output to help troubleshoot errors and using Trace-Command to improve on things as well.
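None of this is Jeff's code, but here's a tiny sketch of the ideas mentioned above so they make sense out of context (the function name and file paths are made up):

    # Default parameter values apply automatically for the rest of the session
    $PSDefaultParameterValues['Get-ChildItem:Force'] = $true

    function Copy-DemoFile {
        [CmdletBinding()]
        param([string[]]$Path)

        $i = 0
        foreach ($p in $Path) {
            $i++
            # Write-Progress gives long-running loops a visible status bar
            Write-Progress -Activity 'Copying files' -Status $p -PercentComplete (($i / $Path.Count) * 100)
            # Write-Verbose output only shows up when the caller adds -Verbose
            Write-Verbose "Processing $p"
        }
    }

    Copy-DemoFile -Path 'C:\temp\one.txt', 'C:\temp\two.txt' -Verbose

    # Trace-Command shows what the engine does under the covers, such as parameter binding
    Trace-Command -Name ParameterBinding -PSHost -Expression { Get-ChildItem $env:TEMP }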

I have to say this was a great event, and I want to thank everyone involved with putting it on, the PowerShell user group in Chattanooga most of all, because without them we could not have had this event at all. If you want to know more and learn more, find them at https://twitter.com/chatpsug

Handling large lists with SharePoint on premises

If you have spent much time working with SharePoint, you know that once a group of users takes a liking to storing information in libraries or lists, you can certainly tell.

Ultimately what happens is you get an email (or call, or service ticket) one day that says “Hey we are getting some kind of error that says we have exceeded the list view threshold…”. No problem, you think, I’ll just go into central admin and raise the threshold above 5000 and that should fix that.

Well, it does fix the issue, but only until that threshold gets crossed again a week/month/year later.

And the story repeats over and over again until everything is slow and everyone hates SharePoint because it's so slow… am I right???

What I'd like to recommend today is a method I have used many times before that solves this problem and hopefully teaches your users a more efficient way to store their content in SharePoint.

For this example, let's say I have a list with 30,000+ items in it.

After looking at the content in this list, I see that we have about 5-6 years' worth of information in it.

And that there is the kicker. The solution I want to recommend in this case is that you take your content from this list of too many items and break it into several smaller lists by year.

Now this structure is much more manageable, and it also allows you as an admin to go into Central Admin and lower that threshold to something that won't cause SharePoint to move as slowly as the last day at work before you go on vacation (anyone else ever experienced that?).

Probably the easiest way to accomplish this would be to use a third-party tool that your admin group might already have. There are bunches of them, and you can read more about them here:

https://collab365.community/sharepoint-comparison-matrix-for-3rd-party-migration-tools/

If you are more developery and want to tackle the task using PowerShell, have at it, but your mileage may vary, especially when dealing with item-level permissions, notifications, and workflows (beware and double-check for these).
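To give an idea of what that looks like, here is a very rough server-side object model sketch that copies items from one big list into per-year lists. The site URL and list names are placeholders, it copies rather than moves (the originals are untouched), and it deliberately ignores attachments, versions, alerts, item-level permissions, and workflows, which is exactly where you need to be careful:

    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    $web  = Get-SPWeb "http://sharepoint/sites/teamsite"
    $list = $web.Lists["Big List"]

    foreach ($item in @($list.Items)) {
        $year       = ([datetime]$item["Created"]).Year
        $targetName = "Big List $year"

        # Create the per-year list the first time we see that year
        $target = $web.Lists.TryGetList($targetName)
        if ($null -eq $target) {
            $web.Lists.Add($targetName, "", [Microsoft.SharePoint.SPListTemplateType]::GenericList) | Out-Null
            $target = $web.Lists[$targetName]
        }

        # Make sure the target list has all the writable columns from the source list
        foreach ($field in $item.Fields) {
            if (-not $field.ReadOnlyField -and $field.InternalName -ne "Attachments" -and
                -not $target.Fields.ContainsField($field.InternalName)) {
                $target.Fields.AddFieldAsXml($field.SchemaXml) | Out-Null
            }
        }

        # Copy the writable field values into a new item on the per-year list
        $newItem = $target.Items.Add()
        foreach ($field in $item.Fields) {
            if (-not $field.ReadOnlyField -and $field.InternalName -ne "Attachments") {
                $newItem[$field.InternalName] = $item[$field.InternalName]
            }
        }
        $newItem.Update()
    }

    $web.Dispose()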

You can also set indexes on columns and use other strategies like metadata to help with scenarios like this, but those are topics that could be their own posts in the future.
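Just to give a taste of the indexing piece (reusing the $list variable from the sketch above), flagging a column as indexed is only a few lines; the column name here is just an example:

    # Mark an existing column as indexed so filtered views on it behave better against the threshold
    $field = $list.Fields["Created"]
    $field.Indexed = $true
    $field.Update()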

So going forward, please, I beg you, don't just increase that view threshold and leave it; you will eventually have to come back and deal with the consequences. It brings to mind that old saying about an ounce of prevention being worth a pound of cure.

Thanks for reading and I hope this helps prevent any future pains in SharePoint!

AutoLab to create quick test labs

One of the best ways to learn how to do things in the field of technology is to actually get your hands dirty and practice doing something. That's one method I have followed for years, and it has helped me learn most of what I know today.

One way to do this is by creating virtual machines through software like Hyper-V, VMware, or another program to build and run things locally on your own machine. I've been doing this for a while, but I always find it a struggle to quickly spin up machines with certain things enabled and configured a certain way.

I’ve tried different methods in the past to help streamline things like this, but the other day I was made aware of a project on GitHub that really looks like it could be a great help for people who need things built a certain way quickly.

The project is called AutoLab and was created by Jason Helmick (blog, Twitter), a great trainer who has made some fantastic online courses through Pluralsight and other venues as well.

I've watched several of his PowerShell training videos in the past to help learn new things, but I think there is some real value here in AutoLab. It only works with Hyper-V, but it even helps you enable Hyper-V on a Windows box as part of the installation process.

The basis of the project is another GitHub project called Lability.

The features of the project combine PowerShell Desired State Configuration (DSC) with config files and other functions to quickly build up a lab, refresh a lab, take a snapshot of multiple VMs at once, and, when you are done, tear down the entire lab with one command.

HOW AWESOME IS THAT!!! Very awesome indeed!

Right now there are many configurations that people have contributed in order to set up different scenarios.

I think there’s some great potential here for this to quickly build up test labs when you need to learn a particular skill or subset of features on a new product.
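To put that convenience in perspective, here's roughly what snapshotting and tearing down a handful of lab VMs looks like with plain Hyper-V cmdlets, which is the kind of busywork these projects roll into single commands. The LAB- name prefix is just a convention I made up for this example:

    # Snapshot every lab VM at once before trying something risky
    Get-VM -Name 'LAB-*' | Checkpoint-VM -SnapshotName 'Before-SharePoint-Install'

    # Roll them all back to that checkpoint
    Get-VM -Name 'LAB-*' | ForEach-Object {
        Restore-VMSnapshot -VMName $_.Name -Name 'Before-SharePoint-Install' -Confirm:$false
    }

    # Tear the whole lab down when you are finished (the VHD files still need cleanup separately)
    Get-VM -Name 'LAB-*' | Stop-VM -Force -Passthru | Remove-VM -Force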

If you get a chance, check it out, try it, and give feedback to the creators.

I’m going to do my part to help this take off because it’s a great idea!

Thanks and enjoy!

-BJ

 

New fentressdev.org blog domain

Ever since I started blogging over 6 years ago, I have always operated under the free site model, but that changes as of today.

Officially, my blog's URL has now changed to bjfentress.fentressdev.org

I plan to keep posting the same kind of content here, but from now on, for the foreseeable future, everything will be created under this new domain for my blog.

One of the big reasons I made this change is so I could use Google Analytics. I think it's vital for me as a content creator to know more about the people who visit my blog so I can tailor content for that audience and deliver more value going forward.

Plus, if my posts aren't getting any views, analytics will definitely let me know whether to go deeper on topics in one area or change gears and switch to another topic that could benefit others more going forward.

I hope this change doesn't throw too many things off kilter and that people can get to my content just as easily as they did before. If you have any thoughts about the change, feel free to share them with me in the comments below!

Thanks!

-BJ

SharePoint resources for demos

For today's post I thought I would gather everything needed to create a working demo of SharePoint locally. Nowadays, Azure and other cloud providers make this much easier, but if you want more control you can spin up your own environment locally.

Windows Server

2019 (still beta as of now)

https://techcommunity.microsoft.com/t5/Windows-Server-Insiders/bd-p/WindowsServerInsiders

2016

https://www.microsoft.com/en-us/evalcenter/evaluate-windows-server-2016

2012R2

https://www.microsoft.com/en-us/evalcenter/evaluate-windows-server-2012-r2

 

SQL Server

2017

https://www.microsoft.com/en-us/evalcenter/evaluate-sql-server-2017-RTM

2016

https://www.microsoft.com/en-us/evalcenter/evaluate-sql-server-2016

2014

https://www.microsoft.com/en-us/evalcenter/evaluate-sql-server-2014-sp2

 

SharePoint 2016

https://www.microsoft.com/en-us/download/details.aspx?id=51493

 

SharePoint 2013

Foundation

https://www.microsoft.com/en-us/download/details.aspx?id=35488

Server (Standard and Enterprise)

https://www.microsoft.com/en-us/evalcenter/evaluate-sharepoint-server-2013

 

SharePoint 2010

Foundation

https://www.microsoft.com/en-us/download/details.aspx?id=24983

Server (Standard and Enterprise)

https://www.microsoft.com/en-us/download/details.aspx?id=16631

 

Enjoy and happy lab building!

Office Files too big? Inspect them!

Today I had a problem with a group of users who were trying to work with a very large PowerPoint file in SharePoint; I didn't realize until I downloaded a local copy just how big the file was, and it was too big to work with through Office Web Apps.

So I did some digging around to see if there was a way to shrink the files, and I ended up finding this:

https://support.office.com/en-us/article/remove-hidden-data-and-personal-information-by-inspecting-documents-presentations-or-workbooks-356b7b5d-77af-44fe-a07f-9aa4d085966f

In PowerPoint 2016 you can find the option under File > Info > Check for Issues > Inspect Document.

After doing a bit more digging, this actually applies to all Word, Excel, and PowerPoint versions from 2016 on down to even the Office 2010 versions.

So now I’d be curious if this could be automated somehow to do some major cleanup in PowerShell or something? Hmmm… maybe for another blog post later… hehehe.
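I haven't scripted the actual cleanup yet, but as a starting point you can at least see what is bloating a file from PowerShell, since the modern Office formats are just ZIP packages; the usual culprit is embedded media. The file path below is made up:

    # List the ten largest parts inside a .pptx/.docx/.xlsx package
    Add-Type -AssemblyName System.IO.Compression.FileSystem

    $zip = [System.IO.Compression.ZipFile]::OpenRead('C:\temp\BigDeck.pptx')
    try {
        $zip.Entries |
            Sort-Object Length -Descending |
            Select-Object -First 10 FullName,
                @{ Name = 'SizeMB'; Expression = { [math]::Round($_.Length / 1MB, 2) } } |
            Format-Table -AutoSize
    }
    finally {
        $zip.Dispose()
    }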

Hope this helps if you need to free up some space in the future.

Enjoy!

GitHub for Sharing

Recently I decided to jump into GitHub to share some of the scripts that I have used and created for my own benefit in order to see if this helps out any regular visitors of my blog as well.

You can check it out over at https://github.com/fentressbj

Going forward I will try to capture all of the relevant scripts that I have used over the years for configuring and building out different setups.

Of course, if I use a script from another author, I will absolutely try to provide credit. I think it's dishonest not to do that if it's not something you have written from scratch, and if you take a script and improve or modify it, I think the original author should still be given credit. These days it's extremely hard to be a content creator when some people are happy to share things without giving credit.

I hope this helps others and everyone enjoys my dive into learning more and more about GitHub.

-BJ