Jack of All Blogs https://jackofallblogs.com Ins and Outs of Running A Blog Network Wed, 23 Nov 2016 09:15:02 +0000 en-US hourly 1 https://wordpress.org/?v=6.4.3

Google Tips to Secure Yourself and Your Sites Online https://jackofallblogs.com/networking/google-tips-to-secure-yourself-and-your-sites-online/ Thu, 07 Feb 2013 05:20:55 +0000 http://jackofallblogs.com/?p=355

It pays to be safe not only offline but also online. As scams, fraud and identity theft continue to exist on the web, every internet user needs to be vigilant and take the necessary steps to protect their privacy.

Google marked Safer Internet Day on Tuesday, February 5, to raise awareness of the ways people can protect their personal information online. Unknown to some of you, Google launched a site called Good to Know in 2012 as a venue for promoting its educational campaigns, particularly on staying safe online. The site also provides guidelines on how best to secure your computer and mobile devices.

Emails and even blogs today are not spared from hacking, and Google has some practical tips to help you deal with these issues. The search engine giant revealed that it identifies more than 10,000 unsafe websites on a daily basis and takes action to keep internet users informed. Google said it displays warnings on up to 14 million search results as well as 300,000 downloads, alerting users that a site or link may be suspicious. To ensure that your sites, blogs, social media and other accounts do not get hacked, here are simple yet very important steps you can take going forward.

Password

When creating passwords, use long ones that contain numbers, letters and symbols. Do not share them with others, and avoid sending them via email. You can also set up password recovery options via your mobile phone, an alternate email address or a security question.
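The advice above can be automated rather than done by hand. As a minimal sketch (the function name and symbol set here are my own, not anything Google prescribes), Python's standard secrets module can generate a long random password mixing letters, numbers and symbols:

```python
import secrets
import string

def generate_password(length=16):
    """Generate a random password mixing letters, digits and symbols."""
    # secrets uses a cryptographically secure random source,
    # unlike the random module, which is predictable.
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*"
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())
print(generate_password(24))
```

Of course, a password manager accomplishes the same thing and also remembers the result for you; the point is simply that long, randomly generated passwords are easy to produce.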

Content

Do report and flag content that you believe is illegal or abusive. You can also adjust your privacy settings to control how you receive and share content.

Browser

Be sure to keep your browser and operating system up to date. Download updates only from official sources to avoid viruses and malware.

Web Address

Most often, we’re asked to sign in before entering a site such as social media and online banking services. To ensure that you’re connecting safely, check that the web address starts with https://, which means your connection to the site is encrypted.
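If you are checking links programmatically (say, auditing the URLs in your own posts), the same test is easy to script. This is a toy illustration using Python's standard urllib.parse; the hostnames are made up:

```python
from urllib.parse import urlparse

def is_encrypted(url):
    """Return True if the URL uses the HTTPS scheme."""
    return urlparse(url).scheme == "https"

print(is_encrypted("https://mybank.example.com/login"))  # True
print(is_encrypted("http://mybank.example.com/login"))   # False
```

Note that HTTPS only tells you the connection is encrypted, not that the site itself is trustworthy, so this is one check among several.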

Why We Don’t Worry About Every Scraper https://jackofallblogs.com/networking/why-we-dont-worry-about-every-scraper/ Tue, 30 Nov 2010 21:09:54 +0000 http://jackofallblogs.com/?p=67

In addition to being a writer for Splashpress, including on BloggingPro, Freelance Writing Jobs and now Jack of All Blogs, one of my responsibilities for the company is copyright and plagiarism enforcement. I help monitor where Splashpress Media content is being used and, when appropriate, secure its removal or, at the very least, its banning from the search engines.

While these are all tasks I perform routinely as part of my work as a copyright and plagiarism consultant, I’m happy to say that Splashpress has adopted a practical policy on scrapers: one that allows the company to enforce its rights without having to pursue every single case of infringement, no matter how unimportant.

The company recognizes that, as an organization with over 100 sites and 200 people, it is impractical, especially on a reasonable budget, to target every single scraper or spammer that misuses its content, let alone every attributed reuse by a human.

As such, the company carefully targets those that it goes after, using its resources to focus on those who cause the most damage and create the most headache.

While it is a relatively simple process, understanding it requires a basic grasp of how the search engines parse duplicate content and why not every case of infringement is action-worthy. Most importantly, it involves understanding tips that can show any blogger how to use the spammers to their advantage and turn duplicate content into free advertising.

Understanding Duplicate Content

As I recently explained on BloggingPro, duplicate content is when the same or very similar content appears on multiple pages. It can be pages on the same site or pages across different domains.

Search engines don’t like this because they want to showcase a wide array of original material with every search result. As such, they do their best to detect duplicate content, determine which URL is the best and/or original and then penalize the others with the same content. To do this, the search engines use a variety of tactics including looking at which URL was posted first and seeing which page has the most inbound links.
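How search engines actually detect duplicate content is proprietary, but the core idea of spotting near-identical text can be illustrated with a toy sketch. This example (my own illustration, not any search engine's algorithm) compares two texts by breaking them into overlapping word "shingles" and measuring the overlap:

```python
def shingles(text, k=5):
    """Break text into the set of overlapping k-word phrases it contains."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Two identical pages score 1.0, unrelated pages score near 0.0, and a scraped copy with a few words changed scores somewhere in between, which is exactly the signal that makes verbatim scraping so easy to catch.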

Fortunately, Google does a decent job of detecting which page is the original. Though spammers will scrape the contents of a site, usually via its RSS feed, and republish it on their blogs, they rarely fool the search engines. With few inbound links and URLs that appear hours after the original, their copies have a hard time passing as the original work.

This doesn’t mean it can’t or doesn’t happen, which is why I do occasionally have to step in. However, realizing that most scrapers aren’t hurting the sites they lift from, no matter how hard they try, frees up Splashpress to focus on creating new, interesting content and entertaining and informing its readers.

It’s a win/win but it wouldn’t be possible without a few additional steps.

Preventing Scrapers from Mattering

Just because most scrapers and other plagiarists don’t usually impact the sites they copy from doesn’t mean it can’t or won’t happen. As such, we take proactive measures to decrease the chances of them hurting our sites.

One of the key steps we take is an internal linking editorial policy. Almost every article, when appropriate, has a link to a different relevant article on the same site or, if one isn’t available, another Splashpress property.

The reason for this is simple, in addition to directing readers to relevant content elsewhere on the network, these links are also picked up by spammers and republished. Those links, in turn, become valid, inbound links that search engines pick up and place value on.

This has two very important effects:

  1. Increases Page Ranking: By generating more inbound links, the spammers are actually helping the sites involved rank better. Though the importance of links from spammers is likely minimal, it’s still a help and, given the amount some of our content is lifted, can be quite powerful when spread across so many sites.
  2. Prevents Duplicate Content Penalties: By linking to the original site, the spammer is essentially “voting” for it, and search engines see that and weigh it when determining which version is the original. This helps ensure the original version isn’t accidentally penalized as a duplicate.
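The editorial policy above (almost every article carries an internal link) is easy to check mechanically before publishing. As a minimal sketch, with a made-up helper name and a deliberately naive regex (real HTML parsing would use a proper parser), a pre-publish check might look like:

```python
import re

def has_internal_link(html, domain):
    """Check whether a post body contains at least one link to the given domain."""
    # Naive extraction of href values; fine for a quick editorial check,
    # though a real tool would use an HTML parser instead of a regex.
    links = re.findall(r'href="([^"]+)"', html)
    return any(domain in link for link in links)

post = '<p>See our <a href="https://jackofallblogs.com/networking/">earlier post</a>.</p>'
print(has_internal_link(post, "jackofallblogs.com"))  # True
```

A script like this run over a draft queue would flag any article missing the internal link that makes scraped copies work in the original site's favor.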

While the system isn’t perfect, it has served us well, making it so that well over 99% of all spammers can be safely ignored.

Still, every once in a while a spammer or scraper gets lucky and starts ranking well with our content. In those cases, that is where I step in.

Dealing with Outliers

In the rare cases where a spammer does manage to start causing harm to the original sites, we do have an action plan in place and it closely mirrors the Stopping Internet Plagiarism guide that I have written on my main blog, Plagiarism Today.

Basically, the system consists of first trying to contact the scraper, which is rarely possible, and then filing a takedown notice with the site’s host to get the entire domain removed. If that fails, we file a similar notice with the search engines to get the site removed from their indexes. While this doesn’t remove the site from the web, it at least prevents it from competing with the original sites using their own content.

To date, we’ve been able to resolve every damaging case of infringement without much problem. However, that wouldn’t be possible if we tried to stop every single one as, without the ability to focus on those that were actively hurting us, we probably wouldn’t be able to ensure resolution.

All in all, the need to take such action is very rare, a couple of cases per month at the most, but it is necessary and we are prepared. However, it isn’t nearly as necessary as many think that it is.

Bottom Line

In the end, content management and enforcement are an important part of any site’s business. However, that isn’t the same as stamping out every unauthorized copy that exists. Not only is that impractical, it is a tremendous waste of resources.

Managing your content is about much more than simply removing works; it involves understanding how your content is being used, encouraging useful and beneficial copying, and dealing with harmful uses.

Fortunately, this is something that Splashpress Media does very well and is something that I’m very proud to be a part of.

My hope is that others will understand this as well so we can all work on spending our energies on what’s really important, creating good content and promoting it well.
