Avoiding SEO Penalties in 2017

SEO penalties exist to stop website owners from abusing the system to push their sites further up the search rankings. Years ago, common techniques included keyword stuffing, buying large numbers of links (regardless of relevance) and unnatural on-site optimisation. Google introduced a number of algorithm updates to push poor-quality sites down the rankings whilst lifting the good ones.

Penguin:  Penguin was an algorithm update introduced in 2012. It focuses on the way sites build their links, penalising any site which appears to use manipulative techniques. These include guest posting on irrelevant blogs, keyword stuffing (repeating your keyword unnaturally often) and over-optimisation of anchor text. To avoid a Penguin-related penalty, get rid of any poor-quality or spammy backlinks, stop keyword stuffing, place your keywords in more natural positions and vary your anchor text so it reads naturally.
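When spammy backlinks can't be removed at the source, Google's disavow links tool lets you ask Google to ignore them. It accepts a plain-text file of URLs and `domain:` entries, one per line, with `#` comments. A minimal sketch (the domains and URL here are made-up examples, not real spam sources):

```text
# Links from low-quality directories we could not get removed
domain:spammy-directory.example
domain:cheap-links.example

# A single bad page rather than a whole domain
http://irrelevant-blog.example/guest-post-farm/page1.html
```

The file is uploaded through Google Search Console's disavow tool; use it cautiously, as disavowing good links can hurt rather than help.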

Panda:  The first major algorithm update, rolled out in 2011. The aim of Panda was to weed out low-quality websites and make it easier for decent sites to climb the rankings. High-quality sites, according to Google, should have content written by people with real expertise, no spammy backlinks, no duplicate content (content copied across a site or page), good spelling and grammar, and be trustworthy (i.e. visitors are safe to enter credit card details). To avoid a Panda-related issue, websites need to earn good-quality backlinks by publishing content that is unique, relevant and genuinely interesting to the reader.
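Where duplicate content is unavoidable (for example, the same product page reachable at several URLs), the standard fix is a canonical link element telling search engines which version is the original. A minimal sketch, assuming a hypothetical shop page at example.com:

```html
<!-- Placed in the <head> of each duplicate version of the page -->
<!-- example.com and the path are illustrative, not a real site -->
<link rel="canonical" href="https://www.example.com/products/blue-widget" />
```

With this in place, ranking signals for the duplicate URLs are consolidated onto the canonical one rather than being split or flagged as duplication.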