
Explore Your Back-Links. Here’s How! — Fili Wiese

admin » 19 January 2010 » In HTML Stuff » Comments Off

Check your back-links as Google knows them

There are at least three ways to explore your back-links in Google:

Google Webmaster Tools (specifically the “Links to your site” section) gives the best and most complete overview of how Google sees your back-links. However, it is important to note that only a verified webmaster/owner of the website can see these back-links. For your convenience, you can even download all the links in spreadsheet format and process them on your own computer.

The link: operator in Google Web Search (link:yourdomain.com) shows anyone a selection of the back-links of any website. It is very important to note that this overview is just a sample from a much larger set of back-links. The link: operator will never give you the total overview, just a number of back-links, and even that number can change every time you submit the query to Google.

Since the link: operator only gives you a limited view of back-links, several people have found that you get a better (and more complete) sample with the following query: “yourdomain.com” -site:yourdomain.com. Let me translate this query: the first part means you are looking for any reference to yourdomain.com (note that the quotes are important), and the second part means you want to ignore any reference to yourdomain.com coming from the domain yourdomain.com itself (more information on the site: operator and other advanced operators can be found here). If an SEO or a competitor analyses the back-links to your website without access to your Google Webmaster Tools account, this is one of the queries they will run.

via Explore Your Back-Links. Here’s How! — Fili Wiese.


Regular Expressions Cheat Sheet

admin » 02 June 2009 » In HTML Stuff » Comments Off

A one-page reference sheet. It is a guide to patterns in regular expressions, and is not specific to any single language. Available in PDF and PNG.
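To give a flavour of what such a sheet covers, here is a quick sketch of two common building blocks, character classes and quantifiers, tried from the command line with grep -E (the sample strings are made up for illustration):

```shell
# A character class plus a quantifier extracts a run of digits
echo "order-123" | grep -Eo '[0-9]+'        # prints 123

# An anchor plus a negated class grabs everything before the @
echo "foo@example.com" | grep -Eo '^[^@]+'  # prints foo
```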



Get homepage images without a custom field

admin » 17 May 2009 » In HTML Stuff » Comments Off


When the first magazine themes arrived a couple of years ago, custom fields were the big thing that drove them. Trouble is, everyone hates filling them out. Thankfully, it is very easy to circumvent the need for custom fields with a piece of functions.php code. We’ll also resize the image using phpThumb, which was used in the example above.

// Get the URL of the first image in a post
function catch_that_image() {
    global $post;
    $first_img = '';

    // Match the src attribute of the first <img> tag in the post content
    preg_match_all('/<img.+?src=[\'"]([^\'"]+)[\'"].*?>/i', $post->post_content, $matches);

    if (!empty($matches[1][0])) {
        $first_img = $matches[1][0];
    }

    // No image found: fall back to a default image instead
    if (empty($first_img)) {
        $first_img = "/images/default.jpg";
    }

    return $first_img;
}

All that is left to do is display the image on the homepage, which we can do with the following code:

<img src="<?php bloginfo('template_url'); ?>/phpthumb/phpThumb.php?src=<?php echo catch_that_image() ?>&w=200" alt=""/>

The image will be resized to 200 pixels wide.

Source – WordPress Support Forums

via 10 tricks to make your WordPress theme stand out.


50 sites to find free stock images

admin » 13 May 2009 » In HTML Stuff » Comments Off

FreeFoto.com : Lots of images, organized in different galleries.

Dexhaus : Good site with excellent photos.

Kavewall : images and textures.

Digital Dreamers : Different galleries.

StockVault : Very well-known, and very good of course.

FreePhotosBank : Good choice.

via 50 sites to find free stock images.


10 Easy Ways to Secure your WordPress Blog

admin » 13 May 2009 » In HTML Stuff, Wordpress SEO Plugins » Comments Off

1. WP Security Scan

This very easy to use plugin will sort out some of the basic security issues with WordPress – it can change your database table prefix and alert you to flaws in your installation’s security, amongst other features.

Download.

2. Protect your plugins

Plugins with flaws in them are an easy way for a hacker to get access to your blog. An easy way for hackers to find out which plugins you’re using is to browse to /wp-content/plugins/, where the directory listing shows every plugin you have installed. The solution? Put a blank index.html file in the wp-content/plugins/ folder so the listing can no longer be browsed.
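A minimal sketch of that fix from the command line (the WordPress root path below is an assumption; adjust it to your install):

```shell
# Hypothetical WordPress install location - change to suit your server
WP_ROOT=/var/www/html

# An empty index.html stops the web server from listing the plugins folder
touch "$WP_ROOT/wp-content/plugins/index.html"
```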

via 10 Easy Ways to Secure your WordPress Blog.


The Year of Original Content: How to Fight Back Against Abusers | The Blog Herald

admin » 11 May 2009 » In HTML Stuff » Comments Off

Protest Loudly Against Spam Blogs to Those Who Host Them

Report specific scam, scraper, and spam blogs to those who host them. Blogging about it won’t change anything. Go to the sources and those who host them.

Demand that Google’s Blogspot/Blogger, WordPress.com, and other blog and social media hosting services clean up their sites, remove all spam blogs, and help you fight back against copyright infringement and plagiarism. It’s their responsibility, so remind them.

via The Year of Original Content: How to Fight Back Against Abusers | The Blog Herald.


Prevent content theft: Copyscape / Fair Share / Tracer

admin » 24 April 2009 » In HTML Stuff » Comments Off

Copyscape


You can call Copyscape a ‘content theft search engine’. All you have to do is go to the site and type in your blog’s URL, and it scours the internet for websites and blogs that have either stolen or quoted your blog’s content. You can then decide on a further course of action by contacting the plagiarist or the web host. Copyscape also offers a premium account with which you can run unlimited searches, paste in a block of text to look for copies of it on the web, and more.

Visit Copyscape

Fair Share


Most blog content theft these days happens through automated RSS feed scraping. Go to Fair Share and type in your blog’s feed URL. Their engine searches for plagiarised content and provides you with an RSS feed that links to the scrapers’ sites, so you can keep an eye on the plagiarists from your feed reader itself.

Visit Fair Share

Tracer

[Screenshot: copied text pasted elsewhere, automatically followed by a link back to the source blog]

Tracer is a pretty aggressive way to track content theft, and it works only after you install its gadget or code in your blog. It then tracks user activity just like an analytics program, looking for copy-and-paste activity on your blog. When somebody copies your content and pastes it elsewhere, the pasted text is automatically accompanied by a link to your blog. Have a look at the screenshot above for an instance where the pasted text is followed by the source link.

Visit Tracer


A Deeper Look At Robots.txt

admin » 20 April 2009 » In HTML Stuff » Comments Off

Robots.txt syntax

* User-Agent: the robot the following rule applies to (e.g. “Googlebot,” etc.)

* Disallow: the pages you want to block the bots from accessing (as many disallow lines as needed)

* Noindex: the pages you want a search engine to block AND not index (or de-index if previously indexed). Unofficially supported by Google; unsupported by Yahoo and Live Search.

* Each User-Agent/Disallow group should be separated by a blank line; however no blank lines should exist within a group (between the User-agent line and the last Disallow).

* The hash symbol (#) may be used for comments within a robots.txt file, where everything after # on that line will be ignored. May be used either for whole lines or end of lines.

* Directories and filenames are case-sensitive: “private”, “Private”, and “PRIVATE” are all different paths to search engines.

Let’s look at an example robots.txt file. The example below includes:

* The robot called “Googlebot” has nothing disallowed and may go anywhere

* The entire site is closed off to the robot called “msnbot”;

* All other robots should not visit the /tmp/ directory or any path beginning with /logs (e.g., /logs/ or logs.php), as explained with comments in the file.
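A sketch of the file those bullets describe (reconstructed from the list above, since the file itself did not survive; treat the exact paths as illustrative):

```
# Googlebot has nothing disallowed and may go anywhere
User-agent: Googlebot
Disallow:

# The entire site is closed off to msnbot
User-agent: msnbot
Disallow: /

# All other robots: stay out of /tmp/ and any path starting
# with /logs (e.g. /logs/ or logs.php)
User-agent: *
Disallow: /tmp/
Disallow: /logs
```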

via A Deeper Look At Robots.txt.

