apt get life

Life around technology


Speed up Apache Websites with Expires Headers

2014/08/24 by sudo

Page speed can be a big issue for site owners, developers and systems administrators alike. There are many things you can do at an application level to improve performance, but that takes a long time to review, write, test and implement. What about the quick gains, the things you can do quite easily that will improve performance? Well, it turns out that you can speed up an Apache-based web server by simply enabling a module: mod_expires.

Expires headers are part of the information that gets exchanged when you access a page. Your browser sends a request for the page, and the server responds with headers describing what has been returned. Part of this response is a header called “Expires”, which indicates when the content that has been accessed is going to “expire”. To understand this a bit better, you need to know that when a website loads, its content is downloaded to your computer to be rendered in your browser. Once the site is on your screen, if expires headers are not set, each time you load the site the content is downloaded all over again. This is a performance hit for you, your internet connection and your computer. Expires headers tell the browser to store content temporarily (“cache” the content) on your computer. When you visit the site again in 5 minutes, if expires headers are set correctly, you’ll only download part of the information on the page.

So what should be cached? Well, here are the content types I tend to set expires headers on:

  • Images (jpg, png, gif)
  • CSS
  • Javascript
  • Media content such as mp3, mov, mp4 and other files that don’t change regularly

How do you do it? You can speed up websites running on Apache using mod_expires. This is really simple to set up and configure if you know how to manage sites on the command line.

Enable the module

a2enmod expires

Edit the configuration file. This can be done in either /etc/apache2/sites-enabled/mysite.conf or /etc/apache2/mods-enabled/expires.conf. The virtualhost configuration file will enable it for a single site; the mods-enabled configuration file will enable it for all sites. Choose one, edit it with a command line text editor like nano, and enter the following:

          ExpiresActive on

          ExpiresByType image/jpg "access plus 30 days"
          ExpiresByType image/png "access plus 30 days"
          ExpiresByType image/gif "access plus 30 days"
          ExpiresByType image/jpeg "access plus 30 days"

          ExpiresByType text/css "access plus 7 days"

          ExpiresByType image/x-icon "access plus 1 month"

          ExpiresByType application/pdf "access plus 1 day"
          ExpiresByType audio/x-wav "access plus 1 month"
          ExpiresByType audio/mpeg "access plus 1 month"
          ExpiresByType video/mpeg "access plus 1 month"
          ExpiresByType video/mp4 "access plus 1 month"
          ExpiresByType video/quicktime "access plus 1 month"
          ExpiresByType video/x-ms-wmv "access plus 1 month"
          ExpiresByType application/x-shockwave-flash "access plus 1 month"

          ExpiresByType text/javascript "access plus 1 week"
          ExpiresByType application/x-javascript "access plus 1 week"
          ExpiresByType application/javascript "access plus 1 week"
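
Before restarting, it’s worth checking that the new directives parse cleanly (apache2ctl is the standard control script on Debian/Ubuntu):

apache2ctl configtest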

Now restart or reload Apache to apply the configuration:

service apache2 restart

You can see from the code that the expires time has been set by content type. Each is different, depending on how often the content is expected to change and how big the files are likely to be. For example, a movie file is unlikely to change frequently but is likely to be large, so it gets an expires header telling the browser to store it locally for up to 1 month after the date on which it was first accessed. This makes the site faster to load: when someone visits a site on your server, the browser caches content after the initial page load, reducing subsequent loading times.

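You can confirm the headers are being served by requesting one of the cached file types and inspecting the response (swap in a file that actually exists on your site):

curl -I http://example.com/logo.png

Look for the Expires and Cache-Control: max-age lines that mod_expires adds to the response.
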
For further information on improving page speed you can check out Yahoo’s excellent article here: https://developer.yahoo.com/performance/rules.html

Why not check out the Firefox and Chrome plugin “YSlow”, which checks a range of potential speed issues and offers solutions:

  • FireFox: https://addons.mozilla.org/en-US/firefox/addon/yslow/
  • Chrome: https://chrome.google.com/webstore/detail/yslow/ninejjcohidippngpapiilnmkgllmakh

Google also has some useful tools and guides which can be found here: https://developers.google.com/speed/

Moz also has a brief article on the subject here: http://moz.com/blog/15-tips-to-speed-up-your-website

Filed Under: Guides, Technology, Uncategorized Tagged With: apache, expires headers, Linux, page speed

Run a Remote PHP Web Script from the Command Line with WGET

2012/10/03 by sudo

So, you have a webpage that runs a script which you need to automate? Command line and crontab to the rescue!

Wget, the Linux command-line tool, can “get” PHP pages so that the server executes them, saving the returned output to a file. This makes it incredibly useful for managing automated jobs inside content management systems. It’s really simple to use:

wget -q -O output.log "http://example.com/example_script.php"

The -q flag runs wget quietly, and -O writes the response from your webpage’s script (requested as a full URL) to output.log. Quotes around the URL are highly recommended. You can tell if the script has finished by looking at output.log and making sure the closing HTML tag is there.
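
For example, a quick way to check for the closing tag (what to match depends on what your script actually outputs):

grep -c '</html>' output.log

A count of 1 or more means the page came back in full; 0 suggests the script didn’t finish.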

It’s really easy to add this to a crontab for automation. Simply edit your crontab from the terminal interface (crontab -e) and add the line as you require:

0 6 * * * wget -q -O output.log "http://example.com/example_script.php"

This runs the wget command at 6am every day.
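
If you don’t need to keep the page output, a common cron pattern (not specific to this script) is to discard it instead:

0 6 * * * wget -q -O /dev/null "http://example.com/example_script.php"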

Filed Under: Guides, Technology Tagged With: command line, php, wget

Postfix; Delete Mailq Messages From a Single Sender

2012/09/27 by sudo

So this is a similar post to the previous one on deleting messages to a single recipient, except this focuses on deleting messages from a single sender. It works quite well if a spammer blasts you from a single address.

To remove the nuisance mail from the queue, use a similar command to last time, except $7 indicates the sender’s address:

mailq | grep -v '^ *(' | awk 'BEGIN { RS = "" } { if ($7 == "user@example.com") print $1 }' | tr -d '*!' | sudo postsuper -d -

This will search the mailq output for messages from the specific sender “user@example.com” and pass the matching queue IDs to postsuper with the delete option. Please note that you will need to replace the email address used in the example with your sender’s address, and that the whole pipeline is entered as a single command line.
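
For readability, here’s the same pipeline split across lines with annotations (the comments are mine and assume mailq’s standard output format, where each queue entry is a blank-line-separated block):

mailq |
  grep -v '^ *(' |   # drop the indented status lines that start with a parenthesis
  awk 'BEGIN { RS = "" } { if ($7 == "user@example.com") print $1 }' |   # field 7 is the sender, field 1 the queue ID
  tr -d '*!' |   # strip the active (*) and hold (!) markers from the queue ID
  sudo postsuper -d -   # read queue IDs from stdin and delete them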

You can find more information at the original source: http://themomorohoax.com/2010/05/10/deleting-messages-to-a-specific-recipient-from-the-postfix-queue

Filed Under: Guides, Technology Tagged With: email, mailq, Postfix

Setting email quota in Courier-MTA

2011/03/22 by sudo

If you’re receiving errors along the lines of:
  • 456 Address temporarily unavailable.
  • maildrop: maildir over quota.
  • status: deferred

It’s likely that the mailbox in question has run out of storage space. In my experience this is usually due to the user not sorting through their emails and deleting old ones; in one case there were hundreds of megabytes in the .Trash folder that had simply never been emptied.

In order to increase the mail directory’s quota for the user in question, you need to run the following command from inside the user’s directory:

maildirmake -q 500000000S,1000000C ./Maildir

This will allow 500MB of storage to be used, or 1,000,000 messages (the C figure is a message count) to be stored in their mail directory, whichever limit is reached first.
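
The quota is recorded on the first line of the maildirsize file inside the Maildir, so you can confirm the new limits took effect (paths as in the command above):

head -1 ./Maildir/maildirsize

This should print 500000000S,1000000C.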

After running this command you’ll need to restart the MTA:

/etc/init.d/courier-mta restart

and make sure that the cache of problem email addresses is cleared:
courier clear all

Filed Under: Guides, Technology Tagged With: Courier MTA, Mail Quota

Simple Single Interface Squid Proxy

2011/01/07 by sudo

This is a brief introduction to Squid proxy that covers setting up Squid on Ubuntu/Debian with a single network card and routing traffic through it. I use this for testing websites remotely before putting them live, but it would also be possible to set up a similar box to monitor network traffic.

Before you begin you should have a computer set up with Ubuntu/Debian. I am using Ubuntu 10.04 LTS for my proxy. The rest of these instructions will assume that this is installed and configured on your network as you want it. Also note that Debian users should su to root instead of typing sudo.

1. Get Squid

First things first: we need to get Squid. This can be achieved by typing the following into the terminal:

sudo apt-get install squid

This will download squid from the repositories and set it up on your machine.

2. Set Up Squid

This is easy if you know what you’re doing. The squid config file is in the /etc/squid/ directory, so open it up in a text editor:

sudo nano /etc/squid/squid.conf

Now we need to add the lines that allow access from your network. If you’re in nano, press Ctrl+W and search for acl all src all to find the ACL section.

Since I’m setting this up as a remote server, I have to allow work’s external IP address to access the proxy. Here’s the line I added:

acl remotenet src 123.123.123.123 #Work's external IP address

If you don’t know your external IP address go to: sirnet.co.uk/ip.php

Further down the file there’s a line that reads http_access allow localnet; under this line add:

http_access allow remotenet #allow connections to this proxy server from "remotenet"

Understanding this process

An ACL is an access control list, to which you’re assigning IP addresses. In the above two lines you’ve allowed the IP address 123.123.123.123 to access the proxy server. You can continue adding IP addresses to the remotenet group by copying and pasting the first line you entered with different IP addresses.
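
For example (these addresses are illustrative placeholders), you can list extra hosts one per line, or allow a whole range using CIDR notation:

acl remotenet src 203.0.113.7 #a second remote address
acl remotenet src 192.0.2.0/24 #an entire subnet

The http_access allow remotenet line still only needs to appear once; every src line extends the same group.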

At the end of the file you’ll also need to add the following line of code:

visible_hostname someProxyServerName

Exit and save the squid.conf file by pressing Ctrl+X, then Y when asked if you want to save changes, followed by Enter to overwrite the file.

Restart Squid:

sudo service squid restart

3. Setting Up Your Web Browser

You should now be able to use your server by setting the proxy details in your browser to the following:

Proxy: your squid IP address/DNS name

Port: 3128
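
Before touching the browser, you can also test the proxy from the command line with curl (substitute your Squid box’s address):

curl -x http://192.168.1.10:3128 -I http://example.com/

If the proxy is working you’ll get an HTTP response, and Squid normally adds Via and X-Cache headers to it.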

Filed Under: Guides, Technology Tagged With: proxy server, squid

© Copyright 2015 apt get life