Keeping Your Website’s Copyright Notice Up-to-Date


I am often amazed that some websites fail to update the copyright notice at the bottom of the page to the current year. As I write this, I have hundreds of pages that have a copyright notice, and in about 12 hours from now (since today is December 31st) they will automatically update to the year 2013 without me having to do anything. I’ve noticed that when a new year arrives, many websites spend several months showing last year’s copyright date, and in some cases it can be years out of date. When I see that, it makes me feel that someone is asleep at the switch.

Google uses more than 200 ‘signals’ to measure the influence of a page, and it would surprise me if they didn’t include the latest date in a copyright notice as something that might be of interest. Fresh content is king, and stale content is like two-day-old bread, which is why the bots never stop crawling your content. But to see a copyright that is years out of date on content that has obviously been updated today, well, it makes the site owner look like he’s just not paying attention to the details.

So, how does my copyright update without my intervention? Well, all of my pages are sent through a PHP parser, and my copyright notice has this little snippet of PHP code:

Copyright &copy; 2001-<?php echo date("Y"); ?> Lee Devlin

When it’s run through the PHP parser and rendered in a browser, it translates to:

Copyright © 2001-2024 Lee Devlin

I have that code in all my footers, both on my WordPress and static pages, and it updates the second date every year, without me having to think about it.

If you want to edit your footer, WP themes generally keep the code in a file called footer.php inside your theme’s folder under wp-content/themes. Feel free to copy that code into your footer; you have my permission. Just remember to change it to your name ;-).
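For reference, here is a minimal sketch of how that line might sit inside a theme’s footer.php. The surrounding markup is hypothetical (every theme is different); only the PHP snippet itself matters:

<footer>
    <p>
        <!-- date("Y") is evaluated at render time, so the second year never goes stale -->
        Copyright &copy; 2001-<?php echo date("Y"); ?> Your Name
    </p>
</footer>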

Does Google Throttle Search Impressions?


A few months ago, I read an excellent posting by Lisa Irby about how to use Google’s Authorship feature. This feature was also mentioned in a posting on Google’s Webmaster blog as a way to see your search impressions and clicks on articles for which you claimed authorship.

I had been curious about how some bloggers were getting their faces to appear next to their articles in Google’s search results, so I was eager to give it a try. Following Lisa’s instructions appeared to do the job of claiming authorship for me, but my image was not appearing next to the results. The image I had on my Google Plus account at the time (and many other places, for that matter) was of myself sitting in my LongEZ, like the image shown to the right. I began to ponder whether this was part of the problem, since Google specifically requests a ‘recognizable’ headshot and I’m not very recognizable in that photo. I noticed that was a consistent feature of the faces that were appearing next to articles, i.e., each was a headshot, not an icon or an image of the person’s whole body. It was almost as if Google was running the image through a facial recognition algorithm, and if there wasn’t a recognizable face, then nothing would appear.

So on my last visit to PA, I asked my brother-in-law, Jay Yozviak, Northeastern PA’s premier photographer, if he’d take a professional headshot for me. Whenever I tried to take a headshot of myself, it ended up looking like a mugshot, or at best, a name badge ID photo. No sooner did I upload my pro headshot to my Google Plus account than it started appearing next to my articles in a Google search.

[Image: Google search results with author photo]

This is how a Google search that finds one of my articles shows up now.


Later, I began to look at the number of impressions that were appearing in Google’s Webmaster Tools under Labs/Authorship stats, and I noticed a graph that appeared to be bumping up against some sort of ceiling of 8000 impressions per day. (You can click on the image below to make it bigger.)

My website is hosted with GoDaddy on an IP address that also resolves to many other websites, and it’s no speed demon. Sometimes it loads very fast; other times it can be downright slow. But I can’t complain about the price: for $7/month, they allow 150GB of storage and unlimited websites and bandwidth. On domaintools.com, I see that over 4500 websites resolve to my server’s IP address. Now, I expect that it’s not a single server but a bank of servers that automatically do load balancing and all kinds of other cloud-like behavior. But I know that when Google measured my performance in Webmaster Tools, my site loaded slower than 80% of other sites, and I can’t help but wonder if Google will intentionally back off on the search impressions they show to keep the target host from getting overloaded when they detect a slow server.

The line across the bottom is the 20th percentile for all websites monitored. Therefore, my site, with an average page load time of 5.8 seconds, is slower than 80% of all sites.

Google’s all about speed, and if they find a slow server, they may just throttle its search impressions so as not to provide a bad customer experience. I’ve tried all the various tricks to make the site faster: caching, various PageSpeed recommendations, database optimization, etc., but sometimes the site still takes a while to respond and load.

Going with a dedicated server costs a minimum of $100/month, and I don’t know that a single dedicated server is guaranteed to perform any better. And of course, that unnatural-looking ceiling may be completely natural; there may be exactly that many searchers every day, for weeks at a time, for the kind of results they’d find on my website. It just looks suspicious. Looking at the results for Matt Cutts’s blog, which was used as an example in the posting mentioned above, I see a much more varied number of impressions for his content over time, by at least a factor of two, without any apparent ceiling, even though he’s getting 10x the traffic I am. But he’s also on a dedicated server. And... he’s Matt Cutts :-).

I can’t say for sure what’s going on here, but I know what clipping on an analog signal looks like, and I’d say there is some kind of clipping going on with my number of impressions per day. I also understand that my graph is rounded off to the nearest 500 impressions, which can add quantization effects and make a line graph look unnatural as well. But I’ve also seen a graph from someone who has fewer impressions per day than I do, where Google appears to resolve the impressions per day to finer than 500, and I don’t see any clipping going on there. To me it looks very ‘natural’.

That graph was from someone on a seochat forum complaining about getting too few impressions per day, though I don’t know how he was able to deduce any pattern from the data, other than that it never got above 2500.

Maybe Google is trying to do me, my hosting company, and my visitors all a favor by limiting traffic to a reasonable volume. But if they are doing it, it would be nice to know for sure. And if you know how I can make a GoDaddy.com host (or shared/cloud server) that has thousands of other sites on it run faster without having to pay 1400% more per month for a dedicated server, please leave a comment below.

Backing up a WordPress Blog on GoDaddy


Blogging is a great way to prepare content for the Internet without spending a lot of time worrying about the details of website administration and content formatting. But it’s easier to put content in a blog than it is to get it out of the blog for backup purposes. WordPress has a way to export the postings and comments into an XML file for local storage, but since you have to remember to do it periodically, you’re likely to lose a few postings if something goes awry on the server and you haven’t done a backup for a while. And the XML export doesn’t save the images you may have uploaded to your web host because those are not stored in the same database as the posts and comments.

I started looking around for a WordPress backup solution and was unable to find anything that looked like a good fit. WordPress posts and comments are not located on your GoDaddy web host; they are on a separate host/database that GoDaddy sets up when you install WordPress as an application. I download my website’s changed files to my local PC using a scheduled WS-FTP session. I’d prefer to use FileZilla, my usual FTP client, but it doesn’t have a way to automate periodic downloads, so I’m forced to use WS-FTP for my backups. WS-FTP isn’t free, but until FileZilla supports recurring automated scheduling of uploads/downloads, it’s the only way I know to do this. The FTP backup takes care of downloading my blog images and other website files, but not the actual blog postings or comments.

GoDaddy has a web interface to set up cron jobs, and I was pretty sure I could back up my blog’s database to the same WordPress directory that stores my blog’s theme and PHP files using a script to dump the database. A database dump could put a copy of the postings and comments, in SQL format, in a location where my WS-FTP job could find it and download it on a regular basis.

I figured others would like to know how to do this, so I’ve written up a short ‘how to’ here.

I will assume you are using GoDaddy’s Linux hosting, have an SSH login, and know your way around a Linux system. If not, you’ll need some help from a Linux expert who can understand the instructions below.

First of all, you can run a single command to perform the WordPress database backup manually, to make sure it’s working. Here is an example of mine, which I store in a file called “wpbackup” located in the wordpress directory (please note, this command should all be on a single line):

mysqldump --add-drop-table -h mysqlhostname -u mysqlusername -pmysqlpassword mysqldatabasename | gzip -c > $HOME/html/wordpress/yourblogbackup.sql.gz

If you can’t remember the mysql hostname, username, password, and database name (because GoDaddy generates them for you automatically when you install WordPress as an application), you can find them all in the /wordpress/wp-config.php file. Please note that there is no space between the -p and the password; all other spaces are required. The mysqldump command takes all the data from your wordpress blog database and puts it in a SQL format that allows you to re-import it should the need arise. The “| gzip -c” compresses the database; since SQL is plain text, it compresses pretty well, probably 4:1 or better. If you don’t want to use it, you can leave it out.
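Since all four values already live in wp-config.php, you could also have the script read them from there instead of hard-coding them. Here is a sketch of what that version of wpbackup might look like; the $HOME/html/wordpress path is an assumption, so adjust it to match your account:

#!/bin/sh
# Sketch: pull the database credentials out of wp-config.php so the script
# keeps working even if the generated credentials ever change.
WP=$HOME/html/wordpress

# Extract the value from a define('KEY', 'value'); line in wp-config.php
cfg() {
    grep "define('$1'" "$WP/wp-config.php" | cut -d \' -f 4
}

DB_NAME=$(cfg DB_NAME)
DB_USER=$(cfg DB_USER)
DB_PASSWORD=$(cfg DB_PASSWORD)
DB_HOST=$(cfg DB_HOST)

# Same dump as above; note there is still no space between -p and the password
mysqldump --add-drop-table -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASSWORD" "$DB_NAME" | gzip -c > "$WP/yourblogbackup.sql.gz"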

You will need to confirm that this is working properly. Just make sure it’s executable and type “./wpbackup” to run it. Then you can run “gzip -d” on the yourblogbackup.sql.gz file to turn it back into SQL statements so you can browse the contents of the output file. Once you’re sure it’s working, you are ready to set up the cron job. It is possible to set up a cron job manually by editing the crontab file in your home directory, but GoDaddy has a web interface that allows you to set up and manage cron jobs without having to know how to edit the crontab file directly. If you’re curious, you can see the result by looking at the crontab file in your home directory after you set it up.
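Concretely, the verification steps look something like this (file names match the example above):

chmod +x wpbackup               # make the script executable
./wpbackup                      # run the backup once, manually
gzip -d yourblogbackup.sql.gz   # decompress the dump
less yourblogbackup.sql         # browse the SQL to confirm your posts are there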

Just go to the Hosting Control Center -> Content -> Cron Manager

Set up the job to make the wpbackup script execute on a regular basis. It should run frequently enough to ensure that your FTP download is getting a recent backup of the database.
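For reference, the resulting crontab entry will look something like this; the schedule (every night at 3 a.m.) and the path are just examples:

0 3 * * * $HOME/html/wordpress/wpbackup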

With the cron job executing on a regular basis and a scheduled FTP download of your website, the most recent content from your blog will get backed up, so that should misfortune strike, you’ll be able to restore it to its original condition. The two commands to do that are:

gzip -d yourblogbackup.sql.gz
mysql -h mysqlhostname -u mysqlusername -pmysqlpassword mysqldatabasename < yourblogbackup.sql

(please note, the mysql command should be all on one line)

If you found a better or easier way to backup your WordPress blog, please leave a comment with a link to your solution.

Blog by Hugh Hewitt


I am reading a book entitled ‘Blog’ by Hugh Hewitt. It’s a good book, primarily focused on blogging from a political standpoint and how it’s affecting mainstream media. One of the ironies is that Hugh’s own website, although it is organized like a blog, appears to be hand-edited HTML. Blogging software now includes important features such as permanent links, comments, XML feeds, and archiving. Whenever I visit a blog website, I am always curious to see what tool is being used, such as Blogger, Movable Type, WordPress, MyST, etc., but when I looked at Hugh’s source, there didn’t appear to be any of that. I suppose he might just send the text to some smart webmaster who figures out how to add the links and insert it into the mix. I use FrontPage to edit my main web pages, but that would be very tedious for maintaining a blog, especially if I tried to do the archiving, XML feed, comment fields, and permanent linking manually.

I find personal blogs can be somewhat like the Christmas letters I get each year. I really like reading those letters, even though they are not personalized, because they help me keep abreast of what’s happening in someone else’s life. Usually, they mention the places people have traveled, events they attended, job changes, accomplishments, graduations, and other personal topics. Phil Greenspun sent out a few Christmas letters that were very funny in 1990 and 1991, but apparently gave up the practice. His letters were a sort of self-deprecating satire on the narcissism involved in sending those letters. I guess there is something narcissistic about sending out letters that read almost like press releases, but I still like getting them. At least with blogs, people can always ‘opt out’ if you’re boring or annoying them ;-). I noticed that Phil maintains a blog now, so perhaps there’s not as much need to keep people apprised of his exploits with a yearly letter.

Another reason to maintain a blog is to become a ‘thought leader’ on a topic. My friend Jack Krupansky is a thought leader on the topic of software agents, and he recently started five blogs on various topics. I suppose if you have a company, or are establishing yourself as an authority, a blog dedicated to a particular subject is a good way to keep people coming back to you for advice. Uncensored blogging strikes fear into the hearts of PR departments, which like to scrub everything before it gets released to the public, but it’s hard to sound genuine in those types of postings; they come across much more like a press release than what the person really thinks. GM is starting to blog (and podcast too!), and if they can do it, how much longer will it be before the rest of the Fortune 500 follow suit? Warren Buffett doesn’t blog, per se, but his annual letters to shareholders have the same ‘flavor’ as a blog entry: full of humor and good information, with no pesky PR department’s censoring imprint on them. I guess when you’re one of the richest men in the world and have held your current position for 40 years, you don’t have to ask permission from others about what information you can share.