Posted on December 31st, 2012 2 comments
I am often amazed at how some websites fail to update the copyright notice at the bottom of the page to the current year. As I write this, I have hundreds of pages with a copyright notice, and in about 12 hours from now (since today is December 31st), they will automatically update to the year 2013 without me having to do anything. I've noticed that when a new year arrives, many websites will go for several months with their copyright date still set to the previous year, and in some cases, it can be years out of date. When I see that, it makes me feel that someone is asleep at the switch.
Google uses more than 200 ‘signals’ to measure the influence of a page, and it would surprise me if they didn’t include the latest date in a copyright notice as something that might be of interest. Fresh content is king, and stale content is like two-day-old bread; that’s why the bots never stop crawling your content. But to see a copyright that is years out of date on content that has obviously been updated today, well, it makes the site owner look like he’s just not paying attention to the details.
So, how does my copyright update without my intervention? Well, all of my pages are sent through a PHP parser, and my copyright notice has this little snippet of PHP code:
Copyright © 2001-<?php echo date('Y'); ?> Lee Devlin
When it’s run through the PHP parser and rendered on a browser, it translates to:
Copyright © 2001-2014 Lee Devlin
I have that code in all my footers, both on my WordPress and static pages, and it updates the second date every year, without me having to think about it.
If you want to edit your footer, WP themes generally keep the code in a file called footer.php inside your theme’s directory under wp-content/themes. Feel free to copy that code into your footer; you have my permission. Just remember to change it to your name ;-).
Posted on October 25th, 2012 No comments
A few months ago, I read an excellent posting by Lisa Irby about how to use Google’s Authorship feature. This feature was also mentioned in a blog posting on Google’s Webmaster blog as a way to see your search impressions and clicks on articles for which you claimed authorship.
I had been curious about how some bloggers were getting their faces to appear next to their articles in Google’s search results, so I was eager to give it a try. Following Lisa’s instructions appeared to do the job of claiming authorship for me, but my image was not appearing next to the results. The image I had on my Google Plus account at the time (and many other places, for that matter) was of myself sitting in my LongEZ, like the image shown to the right. I began to ponder whether this was part of the problem, since Google specifically requests a ‘recognizable’ headshot and I’m not very recognizable in that photo. I noticed that this was a consistent feature of the faces appearing next to articles, i.e., each was a headshot, not an icon or an image of the whole body. It was almost like Google was running the image through a facial recognition algorithm, and if there wasn’t a recognizable face, then nothing would appear.

So on my last visit to PA, I asked my brother-in-law, Jay Yozviak, Northeastern PA’s premier photographer, if he’d take a professional headshot for me. Whenever I tried to take a headshot of myself, it ended up looking like a mugshot, or at best, a name-badge ID photo. No sooner did I upload my pro headshot to my Google Plus account than it started appearing next to my articles in a Google search.
Later, I began to look at the number of impressions that were appearing in Google’s Webmaster Tools under Labs/Authorship stats, and I noticed a graph that appeared to be bumping up against some sort of ceiling of 8000 impressions per day. (You can click on the image below to make it bigger.)
My website is hosted with GoDaddy on an IP address that also resolves to many other websites, and it’s no speed demon. Sometimes it loads very fast; other times it can be downright slow. But I can’t complain about the price: for $7/month, they allow 150GB of storage and unlimited websites and bandwidth. On domaintools.com, I see that over 4500 websites resolve to my server’s IP. Now, I expect that it’s not a single server but a bank of servers that automatically do load balancing and all kinds of other cloud-like behavior. But I know that when Google measured my performance in Webmaster Tools, my site loaded slower than 80% of other sites, and I can’t help but wonder if Google will intentionally back off on the search impressions it shows when it detects a slow server, to keep the target host from getting overloaded.
Google’s all about speed, and if they find a slow server, they may throttle its search impressions so as not to provide a bad customer experience. I’ve tried all the various tricks to make the site faster: caching, various PageSpeed recommendations, db optimization, etc., but sometimes the site still takes a while to respond and load.
Going with a dedicated server costs a minimum of $100/month, and I don’t know if there’s any guarantee that a single dedicated server would perform any better. And of course, that unnatural-looking ceiling may be completely natural. There may be exactly that many searchers, every day for weeks at a time, for the kind of results they’d find on my website, but it just looks suspicious. Looking at the results for Matt Cutts’s blog, which was used as an example in the posting mentioned above, he sees a much more varied number of impressions for his content over time, by a factor of two at least, without any apparent ceiling, even though he’s getting 10x the traffic I am. But he’s also on a dedicated server. And... he’s Matt Cutts :-).
I can’t say for sure what’s going on here, but I know what clipping on an analog signal looks like, and I’d say there is some kind of clipping going on with my number of impressions per day. I also understand that my graph is rounded off to the nearest 500 impressions, which can add quantization effects and make a line graph look unnatural as well. But I’ve also seen a graph from someone who has fewer impressions per day than I do, where Google appears to resolve the impressions to a granularity finer than 500, and I don’t see any clipping going on there. To me it looks very ‘natural’.
This graph was from someone on an seochat forum complaining about getting too few impressions per day, but I don’t know how he was able to deduce any pattern from the data, other than that it never got above 2500.
Maybe Google is trying to do me, my hosting company, and my visitors all a favor by limiting traffic to a reasonable volume. But if they are doing it, it would be nice to know for sure. And if you know how I can make a GoDaddy.com host (or shared/cloud server) with thousands of other sites on it run faster without having to pay 1400% more per month for a dedicated server, please leave a comment below.
Posted on September 22nd, 2009 2 comments
Blogging is a great way to prepare content for the Internet without spending a lot of time worrying about the details of website administration and content formatting. But it’s easier to put content in a blog than it is to get it out of the blog for backup purposes. WordPress has a way to export the postings and comments into an XML file for local storage, but since you have to remember to do it periodically, you’re likely to lose a few postings if something goes awry on the server and you haven’t done a backup for a while. And the XML export doesn’t save the images you may have uploaded to your web host because those are not stored in the same database as the posts and comments.
I started looking around for a WordPress backup solution and was unable to find anything that looked like a good fit. WordPress posts and comments are not located on your GoDaddy web host; they are in a separate database on a host that GoDaddy sets up when you install WordPress as an application. I download my website’s changed files to my local PC using a scheduled WS-FTP session. I generally use FileZilla for FTP and would like to use it instead of WS-FTP, but it doesn’t have a way to automate periodic downloads. WS-FTP isn’t free, but until FileZilla supports recurring automated scheduling of uploads/downloads, it’s the only way I know to do this. The FTP backup takes care of downloading my blog images and other website files, but not the actual blog postings or comments.
GoDaddy has a web interface to set up cron jobs and I was pretty sure I could back up my blog’s database to the same WordPress directory that stores my blog’s theme and php files using a script to dump the database. A database dump could put a copy of the postings and comments in SQL format in a location where my WS-FTP job could find it and download it on a regular basis.
I figured others would like to know how to do this, so I’ve written up a short ‘how to’ here.
I will assume you are using GoDaddy’s Linux hosting, have an SSH login, and know your way around a Linux system. If not, you’ll need some help from a Linux expert who can understand the instructions below.
First of all, you can use a single command to perform the WordPress database backup manually, to make sure it’s working. Here is an example of mine which I store in a file called “wpbackup” located in the wordpress directory (please note, this command should all be on a single line):
mysqldump --add-drop-table -h mysqlhostname -u mysqlusername -pmysqlpassword mysqldatabasename | gzip -c > $HOME/html/wordpress/yourblogbackup.sql.gz
If you can’t remember the MySQL hostname, username, password, and database name (GoDaddy generates them automatically when you install WordPress as an application), you can find them all in the /wordpress/wp-config.php file. Please note that there is no space between the -p and the password. All other spaces are required. The mysqldump command takes all the data from your WordPress blog database and puts it in an SQL format that allows you to re-import it should the need arise. The “| gzip -c” compresses the dump; since SQL is plain text, it compresses pretty well, probably 4:1 or better. If you don’t want to use it, you can leave it out.
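If you don’t feel like opening wp-config.php in an editor, a quick grep will pull those values out. This is just a sketch: the path below is an example (it matches the backup script’s directory), so adjust it to wherever your own wp-config.php actually lives.

```shell
# Pull the DB credentials out of wp-config.php without opening an editor.
# The path is an example; point WP_CONFIG at your own install.
WP_CONFIG="$HOME/html/wordpress/wp-config.php"
grep -E "'DB_(NAME|USER|PASSWORD|HOST)'" "$WP_CONFIG" \
  || echo "wp-config.php not found at $WP_CONFIG"
```

The four define() lines it prints map directly onto the mysqldump arguments above: DB_HOST is the -h value, DB_USER the -u value, DB_PASSWORD the -p value, and DB_NAME the database name.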
You will need to confirm that this is working properly. Just make sure it’s executable and type “./wpbackup” to run it. Then you can run “gzip -d” on the yourblogbackup.sql.gz file to turn it back into SQL statements so you can browse the contents of the output file. Once you’re sure it’s working, you are ready to set up the cron job. It is possible to set up a cron job manually by editing the crontab file in your home directory, but GoDaddy has a web interface that allows you to set up and manage cron jobs without having to know how to edit the crontab file directly. If you’re curious, you can see the result by looking at the crontab file in your home directory after you set it up.
Just go to the Hosting Control Center -> Content -> Cron Manager
Set up the job to make the wpbackup script execute on a regular basis. It should run frequently enough to ensure that your FTP download is getting a recent backup of the database.
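If you do end up editing the crontab by hand instead of using GoDaddy’s web interface, the entry would look something like this (a sketch: the nightly schedule and the script path are examples, assuming wpbackup lives in the wordpress directory):

```shell
# Run the wpbackup script every night at 2:30 AM server time.
# minute  hour  day-of-month  month  day-of-week  command
30 2 * * * $HOME/html/wordpress/wpbackup
```

A nightly run means your scheduled FTP download will never pick up a dump more than a day old.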
With the cron job executing on a regular basis, and a scheduled FTP download of your website, the most recent content from your blog will get backed up so that should misfortune strike, you’ll be able to restore it to its original condition. The two commands to do that are:
gzip -d yourblogbackup.sql.gz
mysql -h mysqlhostserver -u mysqlusername -pmysqlpassword mysqldatabasename < yourblogbackup.sql
(please note, the mysql command should be all on one line)
Posted on October 13th, 2001 No comments
Finally, the weekend has arrived. I managed to get the local chapter EAA newsletter out this week, but just barely made it. I try to get it out in time to notify the others in the group about the upcoming monthly meeting, and thanks to the fact that most of them now have email, I can do it with less lead time than when I mailed each one out via postal mail. I feel like I’m writing it to myself sometimes since I’m not even sure it’s being read by the others in our little EAA Chapter 1117. I’ve thought of keeping some of the newsletters online, but can’t justify the space for old newsletters, so I only keep the latest one. Writing a newsletter is a lonely business and, judging by the number of folks who have abstained from volunteering to take over the job in the 6 years I’ve had it, not a very sought-after role either.
I updated my guest book today so it’s not quite so generic, but am still waiting for my first entry.