WordPress Security

From WordPress hosting - “Harden Security”, Ticket 6540

To Do

  1. Admin Accounts (manual install and configuration).

  2. Set WP_DEBUG to false in wp-config.php.

  3. Security plugins (manual install and configuration).

  4. Remove all .htaccess files (hopefully using Salt)

Steps

Admin Accounts

Add 2FA to admin-level user accounts using a good plugin, perhaps one of these: https://kinsta.com/blog/wordpress-security-plugins/

cron

We run WordPress cron tasks using the Linux cron system, so disable the option that fires WP-Cron on every page load in wp-config.php:

# 'wp-config.php'
define('DISABLE_WP_CRON', true);
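With WP-Cron disabled, a Linux cron entry has to run wp-cron.php on a schedule instead. A sketch of such an entry (the path and interval here are illustrative placeholders, not our real layout):

```
# Illustrative crontab entry -- path and interval are examples only.
*/15 * * * * cd /home/web/repo/project/www.example.com/live && php wp-cron.php > /dev/null 2>&1
```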

For information:

  1. Our Nginx configuration disables calling wp-cron.php via HTTP, e.g.:

    location ~* ^/(?:wp-cron.php|wp-links-opml ...
        deny all;
    }
    
  2. The Linux cron task is created using Salt, https://gitlab.com/kb/salt/-/commit/809a65cdd8119a0faeab9afd9d64fe5799694968
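The linked commit has the real definition. As a rough illustration only, a Salt `cron.present` state for this kind of task typically looks something like the following (the state ID, path, user, and schedule are invented, not taken from the commit):

```yaml
# Illustrative sketch only -- the real state is in the salt repo commit above.
# State ID, path, user and schedule are placeholders.
wordpress_cron:
  cron.present:
    - name: cd /home/web/repo/project/www.example.com/live && php wp-cron.php > /dev/null 2>&1
    - user: web
    - minute: '*/15'
```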

fail2ban

The following plugins are compatible with fail2ban:

For more information, see fail2ban - chat with Malcolm.

Plugins

One of our customers was using Really Simple SSL, https://wordpress.org/plugins/really-simple-ssl/ but this duplicates the features provided by our Let's Encrypt certificates. For more information, see Really Simple SSL - chat with Malcolm.

Security plugins

For now, we are using Sucuri, https://sucuri.net/website-security-platform/signup/

Chosen from, https://kinsta.com/blog/wordpress-security-plugins/

User Agent

# pillar
exclude_user_agents
# used in
nginx/include-php.conf

Looking within Salt, we can see nginx has an entry for include-php.conf. Within this file we find:

{%- if exclude_user_agents %}
{%- for user_agent in exclude_user_agents %}
if ($http_user_agent ~* ".*{{ user_agent }}.*") { return 444; }
{%- endfor %}
{%- endif %}

This tells us that, as long as we set exclude_user_agents for a service that is using PHP, we can add whatever user agent our site is struggling with and block it without having to maintain IP lists.

exclude_user_agents:
  - Amazonbot
  - PetalBot
  - SemrushBot
  - meta-externalagent
  - facebookexternalhit
  - DotBot
  - BLEXBot
  - MJ12bot
  - ByteSpider
  - coccocbot-web
  - AhrefsBot
  - serpstatbot

This means that all of the bots in the example will be blocked, because they identify themselves in the User-Agent header.

In our case:

"Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Amazonbot/0.1; +https://developer.amazon.com/support/amazonbot) Chrome/119.0.6045.214 Safari/537.36"

Amazonbot is one of the agents we have listed; it was causing considerable resource drain.

Adding the exclude_user_agents entries above should allow us to block Amazonbot without needing any IP blacklists or whitelists.
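As a quick sanity check of the matching logic (not of the live Nginx config), the case-insensitive substring match that Nginx's `~*` regex performs can be emulated with `grep -i`. The user agent string is the Amazonbot example from our logs:

```shell
# Emulate nginx's case-insensitive regex match (~*) with grep -i.
ua='Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Amazonbot/0.1; +https://developer.amazon.com/support/amazonbot) Chrome/119.0.6045.214 Safari/537.36'
if printf '%s' "$ua" | grep -qi 'amazonbot'; then
    echo "blocked (444)"
else
    echo "allowed"
fi
```

Against the live site, `curl -A Amazonbot https://<host>/` should fail with an empty reply, because `return 444` makes Nginx close the connection without sending a response.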

.htaccess

All copies of .htaccess should be removed from the live folder (and its sub-folders), e.g.:

/home/web/repo/project/www.hatherleigh.info/live
cd /home/web/repo/project/www.hatherleigh.info/live
# to list the files
find . -name .htaccess -type f
# delete the files
find . -name .htaccess -type f -delete
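The `find ... -delete` behaviour can be demonstrated safely in a throw-away directory first (the paths here are scratch, not the real live folder):

```shell
# Demonstrate the .htaccess clean-up in a throw-away directory.
tmp=$(mktemp -d)
mkdir -p "$tmp/wp-admin" "$tmp/wp-content/uploads"
touch "$tmp/.htaccess" "$tmp/wp-admin/.htaccess" "$tmp/wp-content/uploads/.htaccess"
cd "$tmp"
before=$(find . -name .htaccess -type f | wc -l | tr -d ' ')
find . -name .htaccess -type f -delete
after=$(find . -name .htaccess -type f | wc -l | tr -d ' ')
echo "before=$before after=$after"
```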

For more information, see: