I’ve been re-reading some articles about Mastodon from early 2017, right around the time that it started to get some mainstream notice. I signed up for an account on mastodon.social around that time and launched tech.lgbt less than two months later in June 2017. I’ve been a supporter of the community, and have grown to enjoy my time there far more than my time on Twitter, despite having a longer history with the larger platform.

I’ve specifically been reading articles along the lines of “this platform is a fad and here’s why”, which to be fair could still come to pass. With an additional two years of hindsight and growing communities, though, I doubt that will happen anytime soon.

A lot of these takes are reactionary in a way that suggests misunderstanding on the part of the writers, or an attempt to garner more clicks for those stories. I can see both as understandable, based on the fallacies presented and the viewpoints that they represent.

Some Common Misconceptions About Mastodon

I’d like to walk through some of the common misconceptions around Mastodon, which generally would apply to a lot of IndieWeb services if they had the same level of notoriety and name recognition.

It Isn’t “True” Federation

This argument is the one that immediately lets me know the intent of the writer who espouses it. The argument is that technically, since Mastodon instances (the different sites running Mastodon software) are allowed to block other sites or control who has access, they aren’t truly federating.

What is Federation?

First, a definition. Federation in this sense is a bit nebulous, as there are a few ideas of what it means. Generally, the idea is that a user can have an account on a site where all of their information lives. You can have accounts on multiple sites if you want to have separation of identity, but you don’t need to. You can then interact with people on other sites, as long as neither side is stopping the other side from communicating.

In my case, that site is https://tech.lgbt/@david and you can contact me from almost any other instance by using @david@tech.lgbt. This is similar to my Twitter handle of @DavidWolfpaw, but with the addition of the server name that I am hosted at. I can talk to my friend Chris by sending a message to @chris@mastodon.chriswiegman.com, which will notify him in the same way that he would get on Twitter. Both of us are on separate instances of Mastodon, but we can communicate freely between them thanks to federation.

What is True Federation?

When people talk about Mastodon not actually being federated because instances can be locked down, or block other instances, I’ve invariably found that they come from a place of free speech absolutism.

Let’s be clear: I am not a free speech absolutist. I say as much in the Code of Conduct for tech.lgbt, where I have set some rules for being allowed to play in my sandbox. This is not unreasonable, and I state as much when I note that I and any moderators have sole discretion over what we consider unacceptable speech in our spaces.

I’ve never had anyone argue to me in good faith that I am silencing oppressed minority groups. The argument has only been used toward me by individuals who believe that a right to free speech includes a right to a platform. You can say things that are hateful or derogatory to others, but not on the server that I manage and am footing the bill for.

There are people using Mastodon that I don’t agree with. People who claimed a cartoon frog as their mascot and believe that my blocking their activity on my server amounts to abridging their First Amendment rights. Saying that Mastodon isn’t truly federated because I can block them (as I can on Twitter) is complaining that you want to say things and force other people to listen. This is less a misconception than an intentional misrepresentation of community behavior, but it’s one that I see a lot.

Screenshot of Twitter thread about Mastodon instances blocking other instances.
A screenshot (since Twitter isn’t forever!) of this thread on instance blocking

You Cannot Secure Your Identity

Another common misconception is around identity in the fediverse (the nickname given to communities connected via oStatus- and ActivityPub-enabled software like Mastodon and others). I admit, it isn’t something that most people are used to thinking about, given that most of our exposure to social media over the past few decades has been through walled gardens, siloed off from one another. There is only one person with my username on Twitter, and likewise there are usernames I want that I can’t have, because other people signed up for them nearly a decade ago and no longer use them.

I use @david as my handle on my Mastodon instance, but there are surely people using @david on instances elsewhere. The inclusion of my instance name is what fully identifies me and separates me from the other Davids out there.

This is more visible than on other platforms, but is not a unique concern. Think of all of the people named John Smith on Facebook. They all get to use the name that they want as a display name and be found by it, but Facebook identifies each of them separately by a unique ID. Or think of email services, which use the same username@domain to identify recipients. There is a david@hotmail.com who is likely different from david@aol.com and david@gmail.com, none of whom are me. With the combination of username and domain name you can identify the person that you want to send a message to. This system is so ingrained into our usage of email that we don’t even consider it, instead calling the full username and domain combo an email address.

You Cannot Bring Your Followers

This can also be described as a lack of portability. Though notably this misconception is less about the portability from Mastodon, which allows you to easily move instances if you choose, and more about the lack of portability offered by Facebook, Instagram, Twitter, and other social networks. Their businesses are built upon a locked-in network effect, where you have to use them if you want to interact with others that use them.

Via the settings interface of Mastodon you can migrate your account to another instance. This means there is a built-in way for you to leave one server and move to a different server without loss, for whatever reason you choose.

True, you won’t be able to automatically get all of the people from other platforms to follow you to a new platform, but that’s a network-effect problem that isn’t unique to Mastodon or other federated services. It’s telling that the issue really stems from a lack of interoperability on older platforms, not from a service that happily lets you move freely and allows for integrations like Twitter/Mastodon cross-posters.

There is Poor Discoverability

Related to the prior fallacy of portability, there is an issue with discoverability on Mastodon as compared to sites like Instagram and Twitter which are partly built on being able to search for content. How many people actually use those services in this manner outside of hashtags is up for debate, but it is one difference between them that could be seen as a shortcoming.

By default, Mastodon instances only allow searching via hashtags. This means that someone has to have explicitly opted in to making their content searchable, a key distinction. On Twitter I regularly see people use misspellings and self-censorship of toxic terms to avoid dog-piling that can come from people who cruise loaded terms to find people to harass. There was a time that I did the same, avoiding using the term gamergate directly in any of my tweets, out of the concern that I’d be inviting bad faith interaction.

You can still search for individual users, search for content under hashtags, and on some instances do general searches. But there’s no simple way to do a fediverse-wide search on specific terms, and that’s partly the point. Mastodon is not seen as the place to grow your following and build a brand. It’s still a place to find new friends and rebuild your own networks in a different environment.

One last note about hashtags: you can use them in your profile for discoverability; you can create and pin an #Introduction post, which is fairly common; and as an admin you can highlight specific hashtags on an instance to give others an idea of what it’s about.

Mastodon instance hashtags
Hashtags can show up as topics on the intro page to a Mastodon instance

It Costs Too Much

Finally, I’d like to discuss cost, one of the other misconceptions of the older articles that I’d read. As a general user, you can join any number of free instances. The costs are generally borne by the admins of the instance, sometimes helped out by donations, like Patreon accounts (here’s my plug!). There are some instances that are membership only, with or without some sort of dues. Finally, you can host your own instance, which can cost more or less depending on your needs including server performance and number of users.

The cost is not insignificant, but I’m also using Mastodon in a way that is meant to support multiple users. There is a built-in method to limit users on your instance, or even make it a single-user instance, so that all activity on it is created and managed by one person. That could run well on the most budget-level VPS. I imagine that something like mastodon.social runs into thousands of dollars for hosting, but that is a drop in the bucket compared to the hosting costs that larger social networks hide from their users.

In my case, I am running https://tech.lgbt on a $15/month Digital Ocean droplet, after migrating recently from Linode. I also pay them for their cloud backup solution at $2/month as a cheap just-in-case extra peace of mind. I pay around $39/year for the domain, which is due to the unique TLD that I’ve chosen for it. On top of these costs I pay for AWS media storage to make it serve faster and cheaper, which I estimate at under $5/month. In total, this brings running the instance to around $25/month, which is honestly around $1/month/user for how regularly active some people are.

Final Thoughts

A word that I’ve been using a lot recently around various people and modes of discourse is disingenuous. Much of the criticism around Mastodon and other federated platforms has been missing the point or simply incorrect. It’s not that there is no true criticism, which should exist for every platform and mode of thinking. It’s that much of it appears to be coming from a disingenuous place, from those looking to further an ideology of division over one of community founded upon mutual respect.

There are flaws in Mastodon, as in every platform and set of communities. But the above misconceptions are not those flaws, and are misunderstandings that I hope to clear up.

If you have any questions about these services, I can try to help answer them, or at least direct you toward resources that may better be able to help. I’m happy to discuss here, or via your own Mastodon account directed toward https://tech.lgbt/@david. See you in the fediverse!

Next week I’ll give a primer on Mastodon and how to get started with it as a user. Subscribe to get notified when this and other new articles are published.

Subscribe to the 🐺🐾 Newsletter

I’m currently reading Gretchen McCulloch’s new book, ‘Because Internet’, which serves as an overview of linguistic study of the evolution of written (and sometimes spoken) communication brought about by the mass adoption of the internet. Both McCulloch and I implement one of the changes noted earlier on: the word internet has lost its capitalization over the years, with the AP style guide making it “official” in 2016.

But the question that I want to address isn’t the correct usage of capitalization, or even correct grammar in general. After all, I’ve been known to start sentences (or even full paragraphs, as is the case here) with conjunctions, and I’ll propose that ending with a preposition is where the written word is going to.

Instead, I want to discuss my usage of double quotes around the word official above, emphasizing my usage of the word, and providing a big no-no, at least according to Weird Al.

What is Official Usage Anyway?

While the thrust of my argument is that language rules are nothing more than convention, I’ll buttress my credentials in making this claim. I studied linguistics as part of my degree, though I did not take it nearly as far as McCulloch or professional linguists. Language as a concept has always interested me, as we are exposed to the spoken and written word at almost all times and don’t often stop to give thought as to why things are the way they are. Only when something is egregiously incorrect do we consider how a statement could be made more clear.

I imagine that it’s a stereotype that I got interested in computers because they are predictable in a way that other humans aren’t, and that they are understandable in a way that language so often isn’t. Nuance can be hard to grasp over the myriad inflections, intonations, cadences, rhythms, pitches, and all the other vocalizations that I’m forgetting. Try asking several people how they are feeling, and see if you can identify the differences in each utterance of “fine” that you receive in reply.

I can’t speak to the experiences of the large portion of the world who grow up in multilingual environments and intuit the rules in a different way. In higher education, human languages are often taught the way programming languages are when it comes to translating and learning. You take a phrase, break it into component parts, find the right words, and consider the rules on order and variant usage. Sometimes you luck out with cognates; sometimes you have to think about the parts of speech in theory more than you ever do in your first tongue.

The difference between language spoken in public and what you’re taught in school can be huge. The rules you learn can help while you’re learning, but you’ll quickly realize how much humans abhor authority when expressing meaning requires directly contradicting official usage edicts that are passed down as if they had always existed.

Decentralization of the Meaning of the Written Word

The internet (and the lowercase-w web, while we’re at it) has done a great deal to increase the proliferation of alternative writing. This includes abbreviations like wtf, lol, and ftw, which serve the purpose of expressing a longer thought in fewer letters, a boon to slow typists and fat-fingered phone users alike. It also includes the introduction of new concepts and phrases, or words becoming untethered from their original meaning to give us entirely new abstractions. Snowflake used to describe a hexagonal formation of ice crystals falling from a storm, not serve as a derogatory term for someone who is being sensitive.

For the supposed purists who cannot get behind the figurative use of the word literally, or the addition of new words like bougie and rando, please explain to me why we bother with both flammable and inflammable, two words that predate the internet and both signify that something can catch fire.

No one is waiting for permission to update language for the present and future based on but not being bound by the past. As McCulloch points out, many conventions, especially in English, come from historical conquests and infighting, as well as an embarrassment of the language as a second class citizen among other languages that were perceived as more civilized.

If there is concern over the descent of communication due to the visual nature of the internet, consider that children and teens are being exposed to more written and spoken language than at any time in the past, and are likewise writing far more than at any point in history. You may argue that status updates aren’t a replacement for the Great American Novel, with its capitalized gravitas, but you are unlikely to have penned ‘Moby Dick’ before making such a declaration.

I find some validity in the concern that language practices could cause a bifurcation of meaning, but only in short spurts. Sure, there are some higher profile differences between lol signifying “laugh out loud” versus “lots of love” depending generally on age, but that makes my point even more clear: decentralization of language doesn’t belong to the youthful internet denizens alone. It belongs to anyone who is unafraid to take a leap of faith with their phrasing, and embrace when others likewise reciprocate with new meaning of their own.

I haven’t been doing a good job of recapping most of the WordCamps that I’ve been to. I did a recap of WordCamp Atlanta back in May, but since then I’ve been to Jacksonville and Montclair. I also have a few coming up, including Denver this weekend. My goal is to do better write-ups of what I’ve learned at these events. Hopefully I can pass along some useful information to you, or at least remember the events better for future me.

With the number of WordCamps that I’ve gone to this year and the number still coming up, I’ve opted to make trips a bit shorter to better fit my schedule and budget. I’m immensely grateful to SiteGround for making it easier for me to attend WordCamps as an ambassador, not least because I am already a regular user of their hosting and services, and recommend them at our Meetups.

Travel

I flew into Boston on Friday and left Saturday evening after the full day of sessions. I did miss a few talks that I wanted to see on Sunday, which is even more disappointing considering the quality of the talks that I saw on Saturday. It was easily one of the best collections of talks I’ve attended at any event over the past few years.

WordCamp Boston 2019 venue

I started the morning off with a train ride from my hotel to Boston University, where the event was held. The transit that I used was pleasant and affordable, and it makes me wish that we had a more robust system in Orlando for public transit. While I know that I should use our bus system before complaining about it, the inconvenience is immense. I’m grateful that I don’t have to rely on it regularly. I have to make the decision to drive 15 minutes or take three buses over the course of 90 minutes (which leave only once per hour) to join my weekly blogging group, and that doesn’t include the mile of walking to get to and from those buses.

I arrived early with the intent of getting coffee and some pre-event work done. I was derailed when the coffee shop that I’d planned on opened an hour later than expected. I took the opportunity to take a walk through the neighborhood, sitting in a shaded park for a bit to just think and look at animals that we don’t get in Florida. Boston was going through a heat wave, which amounted to a pleasant autumn day in Florida.

Sessions

The first session that I attended was “The Future: Why the Open Web Matters”, delivered by Aaron Campbell. Aaron walked us through a bit of history of the web, some of the challenges that it faces, and what we can do about it. Considering that I was delivering a talk about IndieWeb later in the day, I knew I’d want to see what he had to say. After the talk I spent a bit of time chatting with him, where it became clear that even when a problem can be agreed upon, solutions aren’t quite so easy.

Aaron D. Campbell delivering his presentation “The Future: Why the Open Web Matters” at WordCamp Boston 2019

Following that talk, I went to see Kathy Zant deliver her presentation, “The Hacking Mindset: How Beating WordPress Hackers Taught Me to Overcome Obstacles & Innovate”. Kathy shared her experience getting started in security, some common mistakes that people make, and ways to fix them. She made a great point that WordPress is so large that you’re going to be regularly attacked just for using it on your site.

Kathy Zant delivering her presentation “The Hacking Mindset: How Beating WordPress Hackers Taught Me to Overcome Obstacles & Innovate” at WordCamp Boston 2019

I also spent a bit of time at the Happiness Bar helping with website issues, as well as the hallway track of chatting with friends and sponsors. Some of my favorite conversations come from these moments where we have a chance to dig deeper than we do in online conversation. I do sometimes get deep into a conversation and realize that I’m missing a talk that I would otherwise have attended, but the memories and actionable advice that I get during these impromptu chats are just as important.

The final talk of the day stood out to me most: “The World-Wide Work”, delivered by Ethan Marcotte. He touched on some of the same topics that Aaron and I did, but focused even more on biases encoded into design, both intentional and unintentional. He similarly bemoaned the darkening of the web, and laid out what he sees as a potential path forward. Ethan was unfortunately cut off near the end as he ran over time, but I would have loved to discuss the points that he brought up at length. Alas, I was unable to stick around for long or attend the trolley tour, as I flew out a few hours later.

Ethan Marcotte delivering his presentation “The World-Wide Work” at WordCamp Boston 2019

My Presentation

Again, I’m thankful for the opportunity to attend so many events, meet new people, share ideas, and receive support for doing so from SiteGround. I try not to swoop into events just for my portions, and prefer to stay and interact for the full duration. It is nice to be able to spend the night in my own bed after giving a talk though.

My slides for my presentation, “WordPress and the IndieWeb: Why You Should Own Your Voice”, are available here. I welcome conversation about the talk here, via Mastodon, or on Twitter.

Let’s work together to make the web a more open, equitable place for everyone!

One of the things that I handle for clients at FixUpFox is site speed optimization. I also help with performance optimization, which I almost lumped into this post, but I think that it deserves its own overview at a later date.

I also began blogging more regularly on this personal site, thanks to the support from the Blogging Accountability Group that I formed with a few members of the WordPress Orlando Meetup. One of the things that I want to do is to improve the theme that I use on the site, and one of the ways that I want to do that is to improve site speed.

There are quite a few reasons to want to improve the speed of a site. The ones that matter most to me are resource usage and visitor experience. Loading a smaller page does a lot of good, including:

  • Less bandwidth and resource usage for visitors, improving loading speed and battery life
  • Lighter energy usage footprint, reducing ecological impact slightly

Here’s a quick overview of what I did for my personal site, which can be what I do for a client site, depending on their needs. I have to note that I don’t do much in terms of tracking on my site, and I don’t run ads. Those are often the two biggest blockers that I have to increase page performance for clients.

A baseline of performance

The first thing that I do is establish a baseline for the site. That lets me measure the improvement that I end up with, and also shows me where to start making changes.

The following data was from my WordCamp Atlanta Review post tested with tools.pingdom.com. I also use Google PageSpeed Insights, Lighthouse, and GTMetrix depending on what I’m looking for.

Performance grade: C — 77
Page size: 1.1 MB
Load time: 1.37 s
Requests: 56

The above indicates that I loaded 1.1MB worth of data on this page, taking an average of 1.37 seconds over their tests, and that 56 separate files were requested to load this page.

This isn’t terrible, but I had a feeling that I could do better with a few simple changes.
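Those hosted tools are what I rely on for comparable results, but if you want a quick sanity check from your own machine, curl can report the size and timing of the HTML document by itself (just the document, not the images and scripts it pulls in). Swap in your own URL; example.com here is just a placeholder:

```shell
# Print the downloaded bytes and total transfer time for a single URL.
# This measures only the HTML document, not the assets it references.
curl -s -o /dev/null -w 'size: %{size_download} bytes\ntime: %{time_total} s\n' https://example.com/
```

It’s a rough number compared to a full-page test, but it’s handy for spotting regressions between runs of the heavier tools.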

Removing Unused Scripts, Styles, and Fonts

First, I started by reviewing external requests. That includes any JavaScript files, CSS stylesheets, and fonts that load along with the rest of the page. I had 56 requests being made to load that page, which is far from unusual, but a bit high for a personal post about a trip on my non-monetized site.

Boilerplate Scripts

A plugin that I use to manage things like my Speaking custom post type was made with the WordPress Plugin Boilerplate. That saved time in setting the plugin up, but it also added a bit of code that I didn’t need, including display script and style files that I wasn’t making use of. Eliminating the code that enqueued those files removed two requests that did literally nothing at all.
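If editing the plugin itself isn’t an option, you can often get the same effect by dequeuing its assets from your theme. This is only a sketch: the handles below are placeholders, and you’d swap in whatever handles the plugin actually registers (you can find them in its enqueue calls or in the page source):

```php
// Dequeue front-end assets from a plugin that we aren't using.
// 'my-plugin' and 'my-plugin-style' are placeholder handles.
function my_remove_unused_plugin_assets() {
	wp_dequeue_script( 'my-plugin' );
	wp_dequeue_style( 'my-plugin-style' );
}
// Hook late (priority 100) so the plugin has already enqueued its files.
add_action( 'wp_enqueue_scripts', 'my_remove_unused_plugin_assets', 100 );
```

This keeps the fix in your theme rather than a modified plugin, so it survives plugin updates.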

jQuery Migrate

Next, I looked at jQuery Migrate. This is a script that WordPress loads to help support old code: it acts as a bridge between the latest versions of jQuery and code written for very old versions of jQuery. Since my site runs up-to-date code in its theme and plugins, I could remove this script. I can’t promise that it’s always safe to remove, but you can try and see if anything breaks on your site. Most current themes and plugins have no need for it, so I removed it with the following code, taken from this Dotlayer article.

// Remove jQuery Migrate from front-end page loads.
function remove_jquery_migrate( $scripts ) {
	if ( ! is_admin() && isset( $scripts->registered['jquery'] ) ) {
		$script = $scripts->registered['jquery'];

		if ( $script->deps ) { // Check whether the script has any dependencies.
			$script->deps = array_diff( $script->deps, array( 'jquery-migrate' ) );
		}
	}
}
add_action( 'wp_default_scripts', 'remove_jquery_migrate' );

FontAwesome

Next, I looked at FontAwesome. Loading the script to enable it on my site was the largest file, accounting for around 25% of the page size.

I really like FontAwesome. It’s a great way to get social icons and other useful site iconography without having to find and load new images for each. Plus, it flows smoothly between code, the content editor, and stylesheets.

It turns out that I was only using three icons from FontAwesome across the entirety of my site: the hamburger mobile menu icon, the X close icon when the mobile menu was open, and the moon icon for the basic night-mode that I have a feeling no one even realizes is for that purpose (more on that in the future).

Since I didn’t have a lot of icons to replace, I decided to rebuild those in CSS only. There are a variety of places to find tutorials on drawing in CSS, something that I hope to do in the future as well. For now, I redid those icons in CSS and removed reference to FontAwesome from the theme, saving a lot of the page size in the process.
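As an illustration of the kind of CSS-only replacement I mean, here’s a minimal sketch of a hamburger menu icon. The class name and sizes are made up for the example; your markup and dimensions will differ:

```css
/* A three-line hamburger icon drawn with a background and box-shadow,
   no icon font or image required. Sizes are illustrative. */
.menu-icon {
	width: 24px;
	height: 3px;
	margin: 10px 0;
	background: currentColor;
	/* Draw the other two bars above and below the element itself. */
	box-shadow: 0 -7px 0 currentColor, 0 7px 0 currentColor;
}
```

Using currentColor means the icon follows the surrounding text color, which also makes a night-mode variant easier.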

WP Emoji

WordPress loads its own emoji support, which might seem useful for the number of times that I like inserting a 🐺 or a 🐾 or a 😝 into what I’m writing. But you can see from the previous sentence and my bio at the end of posts that they still display with the system defaults of pretty much every device that would view my site, without that extra load.

I’ve used the following code to blanket remove the WordPress emoji from my site load, though you might have reasons for keeping some of these on.

/**
 * Disable the emoji
 */
function disable_emoji() {
	remove_action( 'wp_head', 'print_emoji_detection_script', 7 );
	remove_action( 'admin_print_scripts', 'print_emoji_detection_script' );
	remove_action( 'wp_print_styles', 'print_emoji_styles' );
	remove_action( 'admin_print_styles', 'print_emoji_styles' );
	remove_filter( 'the_content_feed', 'wp_staticize_emoji' );
	remove_filter( 'comment_text_rss', 'wp_staticize_emoji' );
	remove_filter( 'wp_mail', 'wp_staticize_emoji_for_email' );

	// Remove from TinyMCE
	add_filter( 'tiny_mce_plugins', 'disable_emoji_tinymce' );
}
add_action( 'init', 'disable_emoji' );

/**
 * Filter out the tinymce emoji plugin.
 */
function disable_emoji_tinymce( $plugins ) {
	if ( is_array( $plugins ) ) {
		return array_diff( $plugins, array( 'wpemoji' ) );
	} else {
		return array();
	}
}

Handling Images

I realized that a lot of images on my site were being served larger than needed. This means that I had a space where an image would load, say an 80px square for a profile icon, yet a larger 512px square was being loaded to fill it.

In this case the issue was Webmentions, which I was handling via an IndieWeb plugin. I was showcasing people who liked or retweeted my post on Twitter, as well as those who commented on it.

A whole discussion should be had about the display of metrics, the privacy of individual interaction, and a need to prove popularity through that interaction. For now, my focus was on speed, and I’ve turned off profile image loading. I’m going to be reviewing my usage of this type of interaction demonstration, while still showcasing some cool features that IndieWeb proponents have given us.

Resizing default media sizes

The large image size on my site was set to the default of constraining to a box of 1024px by 1024px. This is a good default for some sites, but nowhere on my site do images generally display that large, even on wide screens. Images were being scaled down by the browser, which meant loading larger files than needed.

If you are on your site dashboard, you can go to the Settings panel in the left sidebar and select the Media page. There you can change the default media sizes to ones that make sense for you. In my case, I made thumbnails a different size to fit my design. They ended up larger than the default, but that let me avoid manually cropping featured images to fit, and they are smaller than the medium size that I would otherwise have loaded. I also constrained large images to an 800px by 800px box, which is around 61% of the area of the 1024px default. That is still large enough for my theme, but saves space in what is often the largest portion of page load.

If you change your image size on an existing website, you’ll also want to regenerate the cropped and resized image files that WordPress makes. The plugin Regenerate Thumbnails is my go-to for this task. You can even set it to only resize featured images, if those are the only ones that need to change.
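If you’re comfortable on the command line and your server has WP-CLI installed, the same regeneration can be done without a plugin:

```shell
# Regenerate every image's cropped sizes to match the current media
# settings; --yes skips the confirmation prompt.
wp media regenerate --yes
```

On a large media library this can take a while, so running it over SSH in this way is often faster and more reliable than doing it through the browser.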

Performance improvement after making changes

Now that I’ve made a handful of changes, it’s time to test the site again to see how we’ve done. I’ve gone back to tools.pingdom.com and run the test again, being sure to select the same server location to get comparable results.

Performance grade: C — 80
Page size: 563.2 KB
Load time: 442 ms
Requests: 36

The performance grade has barely gone up, but that’s more of an overall estimation based on what they think is important, which doesn’t always apply to your site. What has improved is the page size, which is about half of what it was before optimization. That’s already a huge savings! It’s also loading in about a third of the time, and I was able to remove 20 requests from the page load.

At this point we could probably be done and move on, but I figured I’d try a few other small things to improve page speed. I wasn’t doing any concatenation of files, and that seemed like the next best place to reduce the number of requests, increasing load speed.

What about performance optimization plugins?

I have used a few plugins for minification and concatenation in WordPress. Minification strips out the unnecessary spaces and comments that make files easier for humans to read but aren’t needed by the computer. Concatenation combines multiple files into one larger file. While the combined file takes up more space, each file merged into it is one fewer request to make, which again can be a big driver of performance and speed.

Fast Velocity Minify and Autoptimize are two free plugins, each with a variety of free and paid extensions that you can take advantage of. I’ve never used the paid extensions, so I can’t offer any insights there, but the free versions work very well.

In this case I chose Fast Velocity Minify, which I installed on my site and activated with its default settings, making no modifications. Running the speed test again gave me the following results:

Performance grade: A — 91
Page size: 607.3 KB
Load time: 443 ms
Requests: 19

There are a few things to note here. The performance grade greatly improved, and the number of requests was cut nearly in half, which is a big contributor to that score.

But we also see that load time is basically unchanged, and the page size actually increased, despite the fact that we tried shrinking the number of files that loaded.

These optimization plugins can do great things for your site, though they can also add headaches with another layer of caching and the possibility that concatenation breaks the order that scripts need to load to function. I still use them on a lot of sites, but I think it’s important to note that they aren’t a magic fix and are a bit more complex in how they handle your site content.

tl;dr — A recap of what I did to improve performance

I’m going to let the following screenshot (which I optimized, of course) from Google PageSpeed Insights speak for itself in terms of what a few minor changes that took me about an hour’s worth of work did for my site.

And here are the results that I got for the site at various testing times:

                     Before Optimization   After Optimization   After Optimization Plugin
Performance grade    C — 77                C — 80               A — 91
Page size            1.1 MB                563.2 KB             607.3 KB
Load time            1.37 s                442 ms               443 ms
Requests             56                    36                   19

I’ll encourage you to consider deeply what optimization steps apply to your website and which don’t, but here’s a short list of things that I did that you could try.

As a reminder, I am basing this case study on a personal site that does not run ads (though it has a Patreon to support my writing and tutorials!), and uses a single Google Analytics tracking script. When it comes to sites that do extensive tracking or use advertising networks, there are a host of other things to consider.

I hope that the above gives you a place to start when you begin looking at increasing performance and speed on your site. Now go make the web faster and better!

PHP normally only displays fatal errors in the browser, or doesn’t load any page content and gives you a “White Screen of Death (WSOD)” if it hits a fatal error before the page can load. A fatal error is one where something is so wrong in the PHP code that PHP cannot make sense of it and fail gracefully in the background. If I forget to add chocolate chips when making cookies, I still have a perfectly tasty dough. If I forget to add baking soda though, I end up with a flat, gooey mess.

WordPress has a few features built in to make it easier to see PHP errors while you are testing. You’d want to activate these while developing a new site, theme, or plugin to ensure that you are seeing any PHP errors that come up.

As mentioned in the last post on PHP Illegal Strings, there are a few failure types in PHP including warnings, notices, and errors. Turning on WP_DEBUG will allow you to see those failure types so that you can fix them in your code.
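As a rough sketch, here are snippets that would trigger each failure type. The exact classifications vary a bit between PHP versions; for example, an undefined variable raises a notice in PHP 7 but a warning in PHP 8:

```php
echo $undefined_variable;    // Notice (PHP 7) / Warning (PHP 8): execution continues
include 'missing-file.php';  // Warning: failed to open the file, execution continues
nonexistent_function();      // Fatal error: execution stops here
```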

Activating WP_DEBUG

If you have access to all of the files of your WordPress install, you’ll want to edit the wp-config.php file, which is located in the root directory, meaning the same folder that has the wp-admin, wp-content, and wp-includes folders. You’re going to go into that file and add the following line of code near the bottom of the file, but before the stop editing notice:

define( 'WP_DEBUG', true );

/* That's all, stop editing! Happy blogging. */

If that line already exists but says false, change that to true. There can be other lines of code above or below this, but as long as it’s above the comment to stop editing, it’s in the right spot.

What did we do?

We’ve now told WordPress that rather than hide PHP errors, we want them to display while viewing the site. WP_DEBUG is a PHP constant; by convention, constants are written in all caps. We’ve used the PHP define() function to set the value of that constant to a boolean true. Note that we didn’t surround the word true in quotes, otherwise PHP would read it as a string.

Using WP_DEBUG also allows us to see any deprecated functions that are running on our site. Deprecated functions still exist in WordPress but are no longer the standard way to perform a particular task. As an example, long ago in WordPress history you would get the ID of a category with the function the_category_ID(), but now the function to do the same in a better way is get_the_category().
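For instance, here is a sketch of the old call next to its modern replacement. Since get_the_category() returns the current post’s category objects rather than a single ID, you read the ID from the first one:

```php
// Deprecated: triggers a deprecation notice when WP_DEBUG is on.
the_category_ID();

// Current approach: fetch the category objects for the current post.
$categories = get_the_category();
if ( ! empty( $categories ) ) {
    echo $categories[0]->cat_ID; // ID of the first assigned category
}
```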

Using WP_DEBUG_LOG and WP_DEBUG_DISPLAY

There are some companions to WP_DEBUG that can be used to make it even more helpful. You may not always be able to easily see errors if they are loading behind content, or you may want to keep track of them over time to review later. Two other constants that are built into WordPress that can help are WP_DEBUG_LOG and WP_DEBUG_DISPLAY.

Using WP_DEBUG_LOG

Setting up WP_DEBUG_LOG allows you to save all of the debug errors that get displayed to a file in your WordPress install. That file gets saved to wp-content/debug.log by default. Whenever an error occurs that WP_DEBUG would display, it will also get saved to that file with a timestamp of when the error occurred.

To turn on WP_DEBUG_LOG you’ll want to define the following constant:

define('WP_DEBUG_LOG', true);

You don’t have to worry about creating the debug.log file if it doesn’t already exist. WordPress will do this for you automatically as soon as it has an error to log. So hopefully not right away!
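Entries in debug.log follow PHP’s standard error log format, with a timestamp and the failure type up front. A typical line looks something like this (the path, variable name, and line number here are made up for illustration):

```
[15-Mar-2024 10:12:33 UTC] PHP Warning:  Undefined variable $foo in /var/www/html/wp-content/themes/my-theme/functions.php on line 20
```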

Changing WP_DEBUG_DISPLAY

By default, setting WP_DEBUG to true will display all errors on the screen in your browser, on both the visitor-facing frontend and the admin-facing backend of your site. This is fine while you’re editing a site that isn’t live with other people using it, but you don’t want those errors displaying to other site visitors. It will make the site look more broken than it is, and can even be a security concern.

If you want to use WP_DEBUG but don’t want to display errors to the screen, set the following constant:

define('WP_DEBUG_DISPLAY', false);

Again, if you don’t set that as false, it will default to true when debug is turned on. If you are setting it to false, you’re probably also setting WP_DEBUG_LOG to true, since otherwise you won’t see the errors on the screen or in a debug log.

If you want to passively log errors for review later on a live site, just in case any come up, you can combine the three definitions above to turn on debug mode, log errors, and stop them from displaying on the site. I recommend doing this if you don’t have anything else handling these error logs for you, which you’d probably know if you did.

// Enable WP_DEBUG mode
define('WP_DEBUG', true);
 
// Enable Debug logging to the /wp-content/debug.log file
define('WP_DEBUG_LOG', true);
 
// Disable display of errors and warnings 
define('WP_DEBUG_DISPLAY', false);

Continuing to Debug Your Site

The settings above display PHP errors, but they don’t actually do anything to fix them. You’ll need to handle that yourself. Still, they provide an invaluable source of information to determine why something is broken on your site. This won’t show all types of errors that could occur, since not all broken page or feature problems are PHP related.

What it does do is give you a good footing to begin the fun part of debugging: digging into code and squashing bugs as you find them. In this case the old proverb is true: Knowledge is Power.