Reflection: A Year of Moodle Admin & a Big Project

Note: I wanted to write up a piece that reflected on a year of Moodle admin, talked about how I tackled my biggest upgrade project so far, and also laid out my current approach to performing upgrades. Obviously, that is a lot and turned into too much for a post. So here is part 1 with some reflections. I’ll lay out my upgrade process in a shorter, pt. 2 post, coming (hopefully) soon.

I’m approaching a year of Moodle sys-admin life and wanted to take a moment to reflect. I remember the uncertainty (sheer terror?) I felt just starting out: I had a lot to learn. The Linux CLI, Git, webservers, and MySQL databases were all pretty new to me. I remember struggling through the first minor point upgrade I ran on my institution’s production Moodle site. I scheduled downtime for the evening hours and worked into the late night. A year later, this process takes me just a few minutes. Back then, newbie confusion and a need to blindly go in and figure things out led to the work taking way longer than it should have (especially figuring out those Git submodules). Yup, I was sweating the details. But I also remember the satisfaction and relief I felt, and the rush of seeing the update script complete and my upgraded site loading for the first time.

Throughout the last year, I kept practicing. I set up sandbox environments to get more experience on test sites (thanks, AWS). I ran two more minor updates on production and became more confident each time. I got to experience installing, removing, and updating plugins, and worked with a contract developer to update a custom plugin. I even built out a brand new production Moodle site in the cloud, using Moodle with the Snap theme for a MOOC-flavored side project.

All of this is to say that this summer, when it came time to plan not a minor point update but a major one – 3.4 to 3.6 – I felt ready not only to tackle the job but to make real improvements to my process.

Making a plan

Since this was the biggest Moodle project I had yet taken on, I knew it was critical to come into the upgrade with a solid plan. So first I pored over the official Moodle Docs guide to upgrading. I made sure my server environment would continue to meet all recommendations, and painstakingly went through all extra plugins to check for compatibility and available upgrades.

For the main piece of the update – the Moodle code – I initially researched updating in much the same way I had done the smaller point updates. Since I use Git for version control with submodules for plugins, that would simply entail changing the tracked branch from the official Moodle Git repo and doing a git merge. Unfortunately, this was not as easy as it first seemed, and I found that others in the community struggled with this approach. Unlike minor point updates, merging a major update requires resolving many conflicts or differences between your existing code base and the updated code. I quickly decided instead to use a clean copy of the 3.6 code.

So, I simply cloned the MOODLE_36_STABLE branch from the Moodle GitHub page into a new project directory. All I migrated over from my old folder was my config.php file and my .gitmodules file.
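
For anyone following along, that clone is a one-liner – the target directory name at the end is just a placeholder:

git clone -b MOODLE_36_STABLE https://github.com/moodle/moodle.git moodle-new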

That introduced two new challenges:

  1. How to decouple any existing code customization that was tracked in my main Moodle Git repo (and therefore was not present in a fresh version of Moodle core)
  2. How to update plugins and efficiently migrate them as Git submodules to my new project

Decoupling customization

This was pretty straightforward, if somewhat time-consuming. Since Moodle treats basically everything as a plugin, I simply went to the Site Admin -> Plugins -> Plugins overview -> Additional plugins screen and compared that with my .gitmodules file (the file in Git that keeps track of all of your project’s submodules). This let me identify any custom items that were being tracked in my main repository.

Last year I got UP signed up with a GitHub for Education plan, so I had unlimited free private repositories. I just created new repos for the few items that were either homegrown (a custom HTML block) or that vendors, for whatever reason, don’t make available as a public GitHub repo. The most annoying were Kaltura’s video plugins, which Kaltura publishes on GitHub as one big repo with multiple plugins in various directories – a layout that won’t work as a submodule – meaning I had to make my own repo for each individual included plugin. If anyone has an idea of how to include something like that wholesale as a submodule, I’d love to hear from you in the comments!

Anyhow, once I had moved everything into its own Git repo, I just added those repos to my .gitmodules file in the new project.

Migrating submodules

To make sure I was getting updated versions of my plugins that were compatible with 3.6, I just updated my .gitmodules file as needed. Honestly, most Moodle developers do a great job of providing branches for multiple Moodle versions using the same naming convention, so doing a find and replace of “MOODLE_34_STABLE” to “MOODLE_36_STABLE” took care of 75% of the work. But the remaining plugins that needed updating either used non-standard branch naming conventions or had changed remote URLs or maintainers, so there was a fair bit of drudgy manual work to get everything pointed to the right place.
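
For context, a plugin entry in .gitmodules ends up looking something like this (the attendance plugin here is purely an illustration):

[submodule "mod/attendance"]
    path = mod/attendance
    url = https://github.com/danmarsden/moodle-mod_attendance.git
    branch = MOODLE_36_STABLE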

The problem of migrating submodules was a bit less obvious but ultimately much easier and more straightforward. You can’t just copy a .gitmodules file into a new repo – you end up with a bunch of empty plugin folders. But thankfully, if you know the commands to run, it only takes a minute to get up and running (once again, submodules are extremely efficient if you know what you are doing and extremely cumbersome and time-consuming if you don’t). Here is what I ran to bring my submodules back to life in my new project folder. First, make sure the .gitmodules file is updated and contains the submodules you want to include, along with the paths to install them in, the updated branches you want to track, and the remote URLs. Then, in the main project folder (where .gitmodules is located):

First, initialize the submodules listed in the .gitmodules file:

sudo git submodule init

Then clone the most recent commit of the specified branch into each install location:

sudo git submodule update --remote

With that work done, I had a fresh copy of the 3.6 code, with ALL local modifications existing as submodules pointed at 3.6-compatible versions of each plugin. I ran the update script and… boom! Everything went through and installed fine on the first try.
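
(The “update script” here is Moodle’s CLI upgrade tool – assuming a standard setup, you run it as the web server user from the Moodle root, something like:)

sudo -u www-data /usr/bin/php admin/cli/upgrade.php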

Photo by Joshua Chun on Unsplash

Migrating a multisite WordPress install to a new server

One project I’m working on this summer is moving a WordPress multisite install (with about 300 sites) to new hardware. I didn’t find much in the way of documentation or guides on this, especially anything specific to a multisite install. So, to prep for this project, I created a new WP multisite with a few test sites, plugins, and posts on an AWS server and practiced migrating it to a new virtual machine. Here are the steps I took and what I discovered. This was using Ubuntu 18.04 and a MySQL 5.6 database.

First, make sure to get a copy of your WordPress code and database. Archive the entire web root directory of the site with tar:

tar -czvf wp-code.backup.tar.gz html/

Back up the database with mysqldump. Here my database is named wordpress, I can log in as root without a password, and I am creating a backup file called wpdb-backup.sql.

sudo mysqldump --add-drop-table -u root wordpress > wpdb-backup.sql 

…then zip the dump file as well.

tar -czvf wpdb-backup.sql.tar.gz wpdb-backup.sql

Next, we need to transfer the backup files to the new server. If the servers can talk to each other over SSH, use rsync. So that rsync can authenticate without prompting, first create a keypair with no passphrase on the origin server.

Digital Ocean has a good guide on creating SSH keys here if you aren’t familiar: https://www.digitalocean.com/docs/droplets/how-to/add-ssh-keys/create-with-openssh/

If you don’t have an existing key on the server, you can just type

ssh-keygen

… and hit enter to accept the defaults, including an empty passphrase. If you already have an SSH key, instead of accepting the default name (which would overwrite your existing key), type in a new name like rsync_rsa_id.

Once your key is created, you need to copy its public half. If you used the default settings, you can see the public key with:

cat ~/.ssh/id_rsa.pub

Next, SSH in to the destination server and paste the public key contents into the .ssh/authorized_keys file in your home directory:

nano ~/.ssh/authorized_keys
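
(Alternatively, if ssh-copy-id is available on the origin server, it will append the key for you – the user and host here are placeholders:)

ssh-copy-id -i ~/.ssh/id_rsa.pub user@destinationHost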

Now you are ready to transfer files. Go back to the origin server and run rsync:

rsync -avz -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" --progress /file/to/transfer/backup.tar.gz   hostNameOrIPAddress:/destination/path 

A quick explanation of the -avz flags used:

  • -a archive mode: makes the command recursive, copies permissions and symlinks, preserves last-modified times, and so on.
  • -v verbose: see what rsync is doing.
  • -z compress data before transferring.

On the target server

Make sure the server has the necessary prerequisites for WordPress: Apache, PHP, MySQL, rewrite rules, etc. Here is a good guide for Ubuntu:

https://www.digitalocean.com/community/tutorials/how-to-install-wordpress-with-lamp-on-ubuntu-16-04#step-1-create-a-mysql-database-and-user-for-wordpress

Make sure to create a blank database for WordPress, using the same database name, username, and password as on your origin server.
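
(At the MySQL prompt, that’s something like the following – the user name and password are placeholders that should match your origin server:)

sudo mysql
CREATE DATABASE wordpress;
CREATE USER 'wpuser'@'localhost' IDENTIFIED BY 'samePasswordAsOrigin';
GRANT ALL PRIVILEGES ON wordpress.* TO 'wpuser'@'localhost';
FLUSH PRIVILEGES;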

Once the database is set up, it’s time to put your WordPress files in their new home.

Move to the web root directory and extract the tarball we created of the WordPress code:

cd /var/www/html

sudo tar -xvzf wp-code.backup.tar.gz

Then extract the database tarball the same way (sudo tar -xvzf wpdb-backup.sql.tar.gz). But instead of moving the extracted file anywhere, restore it to the wordpress database you created:

sudo mysql wordpress < wpdb-backup.sql

That’s it!

Unless your server has a different IP address or hostname, in which case….

If the new server is not the same IP address/hostname as the old one, or if you are using a different database name, username or password, you’ll need to make some quick edits to the wp-config.php file.

Note: Before editing it, ALWAYS BACKUP wp-config.php. Trust me 🙂

cp wp-config.php wp-config-bak1.php 

Then modify wp-config.php. Update this line to your current IP or hostname:

define('DOMAIN_CURRENT_SITE', 'newHostName'); 

(If you need to, change the relevant lines for the database as well)
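
(Those are the standard WordPress database constants near the top of wp-config.php – the values shown here are placeholders:)

define('DB_NAME', 'wordpress');
define('DB_USER', 'wpuser');
define('DB_PASSWORD', 'samePasswordAsOrigin');
define('DB_HOST', 'localhost');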

Almost done! For a multisite install going to a host with a different name, you need to make an update directly to the database on the destination server. You can use PHPMyAdmin if you have access to that. Or just use the MySQL command line again:

Access MySQL: 

sudo mysql 

Switch to the wordpress database:

USE wordpress; 

Update the wp_blogs table with your new address: 

UPDATE wp_blogs SET domain = 'newHostName'; 

Whooo! After all that, you should have an identical functioning copy of your multisite blog network on your new host!

Photo by Barth Bailey on Unsplash

Upgrading PHP on a CentOS Server

My institution uses CentOS for all of our Linux servers, and I understand why. It’s super stable. But! The default software repository also tends to carry really, really old packages. Which means I discovered I was running a WordPress server on PHP 5.5. No good in general, since PHP 7 is much faster and more secure, and untenable since newer versions of WordPress have raised the minimum PHP version to 5.6.

Anyway, it’s easy to add an extra package repository and switch to a newer version of PHP. Here’s how I updated to PHP 7.2 (and, by the way, instantly made the WordPress site about twice as fast).

First, make sure you have these repositories and tools installed (install commands follow the list):

• EPEL repo
• Remi repos
• Yum-Utils
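
(On CentOS 7, getting those installed looks something like this – the Remi release RPM shown is the EL7 build:)

sudo yum install epel-release
sudo yum install https://rpms.remirepo.net/enterprise/remi-release-7.rpm
sudo yum install yum-utils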

Steps:

  • Make sure things are up to date:

sudo yum update

  • Enable PHP 7.2:

sudo yum-config-manager --enable remi-php72

  • Install PHP and modules:

sudo yum install php php-mcrypt php-cli php-gd php-curl php-mysql php-ldap php-zip php-fileinfo

  • Restart Apache:

sudo apachectl restart

  • Check the PHP version (you should get 7.2.x):

php -v

Photo by AJ Robbie on Unsplash

Tools of the trade: Mac Apps

Today I want to call out a few Mac apps that I’ve personally found to be a great help as a systems admin and builder. These are all simple things that help me get work done.

Termius


Part of the reason I switched from Windows to Mac when I took my job was to have easier access to a bash terminal. But after a few weeks, I found that using the built-in Mac Terminal app wasn’t very efficient for me. I had something like a dozen production and development servers to access on a regular basis, plus a growing number of resources in AWS, and manually typing in hostnames, users, and passwords, and keeping track of keys, seemed unnecessarily burdensome. Before long I found Termius, which is a great SSH manager. Now I add my host information into Termius and can log into whatever server I need in seconds.

It can be used on a laptop for free, but my boss was kind enough to purchase a license for me, so I have extra features and can sync my credentials across multiple devices. That’s cool because Termius works just about anywhere, including my iPad and iPhone. The Mac app looks great, performs well, and also includes an SFTP client and the ability to save “snippets” so you can save and re-use tricky or frequent commands.

Cost: Freemium (60 USD per year)

Pro tip: Organize hosts into Groups and set a default color scheme. I use the classic black and green for my dev servers and a light color scheme on production so I can tell at a glance what environment I’m in.

LastPass


Let’s face it, passwords suck. But until biometrics or physical tokens or something else totally takes over, we’re stuck with them. I had been using this app personally for a while, and my team invested in LastPass about 6 months ago. It has made sharing credentials to access some of our tools much easier (and safer).

Most people probably use LastPass exclusively as a browser plugin that can auto-fill passwords. But it also comes with a pretty nice Mac app if you want to get a bit more in-depth. Beyond username/passwords, there are templates to store lots of useful things: server login info, private SSH keys, account numbers, credit card numbers, and so on.

Cost: Freemium (2 USD per user per month)

Pro tip: When the Mac app is running, you can invoke a quick search of your vault with CMD-Shift-L. From the search results there are buttons to quickly copy usernames or passwords to your clipboard.

Amphetamine


The life of systems administration sometimes involves a lot of waiting around for a process to finish. Sometimes you need your laptop to stay awake even if you’d rather step away. Amphetamine is an app that does just that. Just turn it on and your Mac will stay awake, with no need to futz with power-saving settings.

Cost: Free!

Pro tip: You can choose criteria to keep your Mac awake: while a certain app is running, for a certain amount of time, or indefinitely. Keep the app on, then lock your screen (CTRL-SHIFT-Power button) to let a process run while you go take a coffee break/walk/nap.

Photo by JESHOOTS.COM on Unsplash

What’s In My Bag?

I haven’t written in a while and wanted to get something up so I’m allowing myself an extremely indulgent “what’s in my bag” post.

I switched a while ago from a shoulder bag to a small backpack, the North St. Meeting Bag. Going team tiny-backpack has done wonders for my back. It’s super light and balanced and fits my laptop, tablet, and other tech essentials (with my headphones attached via carabiner on the back).

 a small black backpack

I had been using a loaded 13” Surface Book 2 in my previous role, but when I took the sys-admin job I switched back to my older Mac laptop. The Surface Book was really nice but I went Mac for the integrated terminal, better portability and a general slight preference for the OS (at least in my work environment where Windows is heavily managed/limited by IT). So my current work rig is a 13” MacBook Pro (2015 model, core i5/8GB RAM/256GB SSD). It’s not the fanciest or fastest these days but it more than gets the job done. I take it into work and plug into a dock with wireless mouse and keyboard and two 27” 1440p Dell monitors. Also making the switch with me is a 10.5” iPad Pro I have from my work with the untethered teaching initiative at my institution. I do the vast majority of my work on the laptop but the iPad gets used for light work on the go (email from the coffee shop, digital white-boarding, notes at meetings, watching training videos, etc). Since I use Termius I can even SSH in to servers on the iPad if I need to.

Aside from my devices, I carry a Moleskine notebook, an Anker external battery pack, various power cables and A/V dongles, a few pens and Apple Pencils, a backup pair of earbuds (with the stupid iPhone Lightning dongle), a spare power brick for the Mac, and a slide advancer for presentations.

Gushing Over Git

When I started working in sysadmin/ops land, I knew that version control would be important, and knew a little bit about it. I had used Git in a very basic way while playing around with some amateur coding projects. I knew how to initialize a repo, see which files were tracked, and stage and commit changes. But I didn’t quite appreciate how crucial a thorough understanding of Git would be to being effective at maintaining and deploying code in a systems context – with thousands of users depending on the availability of a server that needs to be patched, updated, etc. safely, securely, and on schedule. It didn’t take long before I was dedicating as much time as possible to mastering Git. I even created a “Git” section in my OneNote notebook, which is a big deal if you know me.

I knew I needed to learn a lot in a hurry. Thankfully, not only is Git open source software with thorough documentation – there is a real open community around Git. A great example of this is the Pro Git textbook. This is a fantastic OER (open education resource). The book can be viewed online or downloaded in an ebook format for free. In my case, I downloaded the .mobi Kindle format and used the “Send to Kindle” option to add the ebook to my Kindle library. This allowed me to read the book with variable fonts and text sizes, and to highlight and make notes. There are also tons of good forum spaces for discussing Git and asking questions (or, more likely, finding an answer that has already been given), from Stack Overflow to forums specifically for Git.

Another virtue of Git is that, by its nature, it allows you to experiment and play. I cloned a repo of my institution’s production Moodle code into my own safe practice space, knowing that I could try things out and be perfectly safe as long as I didn’t push any commits back. And even if I did somehow end up with bad code, you can always reset back to a working commit. This allowed me to work on Git skills and have a Moodle directory to explore and mess up in a consequence-free environment.

The Git skills I’ve built have already paid off – aside from being crucial to any sane workflow for updating Moodle or WordPress sites, Git has already saved me from crashing a site at least once. I got a bit sloppy on a dev WordPress site and did something to the wp-config PHP file that crashed the site. Yikes! It could have been anything from a missing semicolon to a misspelled word. Hard to say, but the site was totally dead. Instead of spending a lot of valuable time poring over the file to find the typo, I executed a simple command to discard my uncommitted changes and restore the last committed version of the file:

sudo git checkout wp-config.php
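
(Side note: the unambiguous form is git checkout -- wp-config.php, with the -- making it explicit that you mean a file rather than a branch; newer versions of Git also offer git restore wp-config.php for the same job.)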

The one area of consternation I’ve had with Git has been submodules. The previous Moodle admin at my school set up a submodule system to manage all of the various plugins we run (50 or so at last count). In theory this should make managing a suite of plugins easier, but in practice I’ve struggled to use submodules with any efficiency, as they make the Git process a bit more complicated conceptually. However, I am starting to get the hang of submodules now, and getting clarity on how they function in relation to the “superproject” in which they reside. This article from Catalyst, The Git submodule: misunderstood beast or remorseless slavering monster?, was especially helpful for me.

Photo by Fatos Bytyqi on Unsplash

It’s All About the Data

This is going to sound stupid, but… one thing I didn’t really think about enough prior to starting my current job is just how much working with data is involved. Yes, I know. Information systems. It’s kind of all about data. But when I imagined what this job would entail, I thought a lot about building things, about creating systems that support and spark teaching and learning. The reams and reams of data that go along for the ride – names, email addresses, course CRNs, logs, files, etc. – just didn’t seem that sexy to me.

But after a few months of being a sysadmin, I’m thinking about data more and more. There are the obvious things: am I doing enough to keep data safe in the age of constant breaches? Are we thinking about giving users control of their data in any meaningful way in light of evolving views about digital privacy? These are big questions. But in the day to day, I’m just working with like, rows and rows and rows.

It turns out, people have questions they want answered about the digital systems that play such an important role in our schools. These are all questions I’ve answered in the last month:

Which instructors used the accessible Moodle theme we provided in their courses?

I accidentally deleted a quiz, is there any way to get the grades each student in my class got on it?

Can you tell me who at the University has not yet completed this mandatory training?

How many people logged into Moodle on the first day of classes?

Here are the CRNs of 50 courses – can you get me the full course name, ID number, and instructor of record for each?

So what have I learned?

Sometimes this data is readily available within an application itself. For example, recent versions of Moodle include very powerful completion tracking and activity reports built into each course. It is also possible to extend this functionality with plugins.

Reports or logs can also generally be exported into a .csv or spreadsheet format. I was already pretty proficient with Excel, but I’ve upped my game in the last few months just from the amount of time I’ve spent in spreadsheets. (I don’t think I’ve opened PowerPoint in that time; I’ve kind of flip-flopped in that regard – less presenting polished data, more churning through raw data.)

In addition to a sheets tool, a good plain-text editor is your friend when data needs to be manipulated to be useful. I’ve been using VS Code and find it really useful for manipulating text, especially the ability to find and replace with a regular expression. This is incredibly handy when you have a comma-separated list but it has to have each item on a new line to import wherever it is you need it to go.
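
For example, with regex mode enabled in VS Code’s find and replace, this pattern/replacement pair puts each item of a comma-separated list on its own line:

Find: ,\s*
Replace: \n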

If built-in reports or exported info isn’t cutting it, I’ve been going straight to the source itself: typically the Moodle database, accessed via PHPMyAdmin. This lets me run SQL queries in a pretty friendly GUI environment and export the results. I’ve found that simple SQL itself is relatively easy to write – it’s understanding the complex structure of a huge relational database, how the tables need to be joined, and thinking through the links between tables that takes a bit of time. But I am getting faster at this process – I find sketching the query out and making a kind of map of the relevant tables and fields helpful – and this is what has allowed me to do things like recover the grades of a deleted quiz or return a list of instructors who have used a certain theme.
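
To give a flavor of what these queries look like, here’s a rough sketch of the “instructor of record” lookup. It assumes the default mdl_ table prefix and the stock role ID of 3 for editing teachers – both can vary by site:

SELECT c.fullname, c.idnumber, u.firstname, u.lastname
FROM mdl_course c
JOIN mdl_context ctx ON ctx.instanceid = c.id AND ctx.contextlevel = 50
JOIN mdl_role_assignments ra ON ra.contextid = ctx.id
JOIN mdl_user u ON u.id = ra.userid
WHERE ra.roleid = 3;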

Sketching out a SQL query

Photo by Markus Spiske on Unsplash

Patch Notes: The Case of the Mysterious Crashing Server

I’ve been kicking a few ideas around, thinking about what topics or experiences I wanted to write about to chronicle my journey into systems administration/architecture. This week I had the fortunate misfortune to come across the perfect situation for the first entry in this blog series, when I received an email from one of the math professors at UP about our WeBWorK server.

A bit of background on WeBWorK: it’s an open source program for delivering math homework via the web. We’ve run it at UP for a number of years, and recently migrated it to AWS, using a single instance running Ubuntu.

Back to our emailing professor. She had been using the system over the holiday break and noticed intermittent lock-ups. She could use the site normally for about half an hour before things went sideways and it became unresponsive; if she waited a few hours and came back it would be working again, but only for about 30 minutes at a time.

After getting the email, I checked the server and, yeah, it was totally unresponsive via the web and SSH. Time for a hard reboot via the AWS console. After rebooting, everything seemed fine, but before long the prof let me know she was still experiencing the same issues. This time I was able to SSH in before things went totally haywire – but trying to execute any commands returned an error:

-bash: fork: Cannot allocate memory

Aha! A clue. I rebooted again and installed htop:

sudo apt-get install htop

htop is a CLI program for monitoring system usage, including memory. I hadn’t used it before, so I took advantage of the Lynda.com account I have through work and watched a good htop tutorial video to get a feel for it. When WeBWorK was in use, I could see via htop that some processes owned by the Apache server (www-data) were eating up all of the available memory, of which we had a paltry 4GB. (I suspect the original sysadmin who built the server intended to use an auto-scaling group that would spin up additional resources on demand, but was never able to get this working correctly.)

A bit of research turned up lots of forum posts discussing WeBWorK’s nasty habit of eating up system memory and failing to give it back, which over time can exhaust the available RAM and crash the server.

Through these posts, I learned about the Apache MaxRequestWorkers and MaxConnectionsPerChild parameters. These control how many processes the server can spawn before shutting down the oldest/most bloated ones, and they can be tweaked to keep WeBWorK from running too many memory-obliterating tasks at once. It’s a balancing act, though: allow too few simultaneous requests, and unnecessary lag is introduced as Apache is forced to constantly create new processes while memory sits unutilized. A quick visit to the appropriate Apache config file at /etc/apache2/mods-available/mpm_prefork.conf confirmed that the server was still at the defaults: 150 MaxRequestWorkers and unlimited connections per child (0 = unlimited here).

StartServers             5 
MinSpareServers          5 
MaxSpareServers          10 
MaxRequestWorkers        150 
MaxConnectionsPerChild   0

A bit more research turned up the install guide for WeBWorK on Ubuntu, with “a rough rule of thumb” of 5 MaxRequestWorkers per 1 GB of memory and a MaxConnectionsPerChild value of 50.

This gave me the formula to determine optimal Apache settings, but I also wanted to increase the system RAM, as 4GB still seemed likely to be insufficient for any heavy use. This was easy to do in AWS, since the WeBWorK instance was a simple, single EC2 instance backed by Elastic Block Store (EBS).

In the EC2 AWS console (AWS CLI equivalent below):

  • Take a snapshot of the root volume attached to the instance (just in case)
  • Instance state -> Stop
  • Actions -> Change instance type (in my case I changed from t2.medium with 4GB RAM to a t2.large with 8GB RAM)
  • Instance state -> Start
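
(The same change can be made with the AWS CLI if you prefer – the instance ID here is a placeholder:)

aws ec2 stop-instances --instance-ids i-0abcd1234example
aws ec2 modify-instance-attribute --instance-id i-0abcd1234example --instance-type "{\"Value\": \"t2.large\"}"
aws ec2 start-instances --instance-ids i-0abcd1234example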

I logged back in and tuned the Apache server for 8GB of RAM with the following settings in the mpm_prefork.conf file (8 GB × 5 = 40):

StartServers             5 
MinSpareServers          3 
MaxSpareServers          5 
MaxRequestWorkers        40 
MaxConnectionsPerChild   50

This was followed with a quick restart of Apache to apply the changes:

sudo apachectl restart

I headed to the site, logged into a test course, and tried out some searches in the Library Browser. Things were improved: I could still see in htop that processes were eating up memory, but they would quickly be killed off and the memory returned to the system. I may have to tweak things when students start logging in and hitting the server with lots of simultaneous small requests, but so far, so good.

Photo by Paxson Woelber on Unsplash

Introducing Patch Notes

2018 was a wild year for me. I had a kid, moved into a new, more technical role at work, and while I haven’t quite yet finished grad school, I’m now close enough that I feel the senioritis kicking in! All that is to say I’m looking forward to what I can achieve in 2019 as I finish school and continue my journey into dad-hood, but it’s the still-new job – Technology Solutions Architect at the University of Portland – that’s the subject of this post.

Moving from the world of edtech, which involved supporting, training, and consulting with faculty on technology tools, to now building, deploying, and administering those tools has been a major transition. As I think about what I know, and what I don’t know, I’ve been reflecting on what helped me to be successful in previous roles and what I can bring to my new one. Something that has stuck out is that I’ve made a habit of teaching, tutoring, writing and making video content that engages with my field. This serves multiple purposes:

  1. I firmly believe that teaching something, whether that’s by presenting on a topic or creating a how-to guide, is one of the best ways to build and retain a deep understanding.
  2. It helps me to show what I know and documents my growth in my new field.
  3. Hopefully my content can help others who are in – or aspire to learn more about – the type of work I’m doing. I’m still in Higher Ed, after all!

So, enter this blog. I’m calling the WordPress category for these posts “Patch Notes” – it’s my way to reflect on and document my professional growth. Possible topics include:

Moodle, WordPress, Linux, databases, cloud computing, open source, education technology, higher ed, Office 365, and more.

Photo by Franck V. on Unsplash