Archives January 2019

Trending SEO News Stories: YouTube, Site Speed and Updates – Jordan Koene // Searchmetrics

Episode Overview: One reliable constant that keeps the SEO industry on its toes is SEO’s never-ending variability. Keeping up with the endless changes is a huge task on its own, on top of executing a successful marketing plan, but maintaining industry awareness is one of the best ways to stay ahead of the curve. Join Ben and Jordan as they launch Searchmetrics’ first breaking news episode, where they aim to increase awareness of industry changes and updates as they discuss three breaking news stories impacting SEO.

Summary:

  • In the wake of the BERT update, OTA sites like Expedia and Orbitz continued to experience decreases over the course of the past month, while ecommerce sites that have experienced volatility are seeing an average 10% increase in visibility.
  • Google released a series of tools and resources designed to help users improve site speeds in Chrome, including an audit-like experience that gives users a snapshot view of overall Lighthouse performance scores.
  • YouTube’s restructuring of its website infrastructure to display more videos and categories has increased its visibility by 15%, and it is now trading the top two visibility spots with Wikipedia.


Ben:                 Welcome to the Voices of Search podcast. I’m your host Benjamin Shapiro, and today we’re going to try something new and talk about three news topics in the world of SEO happening today. Joining us for our breaking news segment is Jordan Koene, who is the lead SEO strategist and CEO of Searchmetrics Inc, but before Jordan and I talk about what’s breaking in the SEO community, we want to remind you that this podcast is brought to you by the marketing team at Searchmetrics. We are an SEO and content marketing platform that helps enterprise scale businesses monitor their online presence and make data-driven decisions. To support you, our loyal podcast listeners, we’re offering a free trial of the Searchmetrics product. That includes the Searchmetrics suite to do all your keyword research and analysis, and Searchmetrics’ content experience tool which allows you to evaluate your content as it’s written. To start your free trial, go to Searchmetrics.com/freetrial. Okay, on with the news. Here’s my conversation with Jordan Koene, lead SEO strategist and CEO of Searchmetrics Inc. Jordan, welcome to the Voices of Search podcast.

Jordan:             Hey, Ben. This is going to be an interesting bumpy road here today.

Ben:                 Out of necessity comes creativity. We had a last minute cancellation, so we have a hole in our editorial schedule. You know what Jordan and I are going to do? We’re going to talk about some of the breaking topics in the SEO community, and maybe this is the best episode we ever did, or maybe we’re winging it and this ends up being a disaster. You guys and girls of the SEO community are going to have to let us know, but that said, we came up with three topics that we want to talk about. Jordan, what is the biggest breaking news in the SEO community this week?

Jordan:             As of today, I think it was like one o’clock in the afternoon, Google actually announced that they’ve been making several updates, but it was a pretty generic tweet. This doesn’t happen that frequently, and it’s an interesting signal because there have been a lot of rumors of a Google update in the past two weeks, that there’s been a big shift in rankings. We’ve seen some volatility in certain categories within our visibility, but it had been unconfirmed by Google. Now Google is coming out and just saying, “Hey guys, yeah, there’s been a few changes,” but they’re not being super specific about what they are.

Ben:                 Let me get this straight. Google may or may not have made a change that impacts SEOs’ business, but they’re not being very specific about what the change was?

Jordan:             Yeah, shocking.

Ben:                 Get out of here.

Jordan:             I know, right? That’s why we have this podcast.

Ben:                 Well, hey, it keeps us in business.

Jordan:             Right.

Ben:                 Look, is this a rollback of some of the stuff that we’re seeing from BERT? Are they launching stuff and testing it and then saying, “You know what? That’s having too big of an impact.” Sometimes there’s a little give and then there’s a little take when there are these updates. What are you seeing from Google?

Jordan:             That’s a great question Ben, and candidly speaking, a lot of what I’m going to talk about is very anecdotal, but here’s what I can say. I don’t think anything that’s happened in the last two weeks has anything to do with BERT. BERT is really there to refine and clean up the relationship between the search query and the results that take place. I don’t really think there’s much of a rollback for BERT. What I do believe is going on is Google, subsequently off of the breaking news of BERT, is going back and doing some proper housekeeping and saying, “Hey, there’s a couple of sites that aren’t as performant as they should be. There’s a couple of sites that have technical debt. There’s a huge amount of websites out there that are flooding the index with garbage pages. Go clean up your garbage.”

Ben:                 Is this housekeeping after all of the change that’s happened, where Google’s hypothetically looking at the performance and saying, “Look, there’s a pile of trash over here, and that guy got screwed, and this girl needs support and was unfairly impacted by the update, so we’re just going to make a couple tweaks,” or is this actually something that is systematic and happening search-wide?

Jordan:             Yeah, great question. It’s not happening agnostically across everything. We’re seeing very much verticalization or site-specific changes. The reality is that we’re very much focused on specific sites. We know of a couple of our clients who are in the news and media space and they’ve made some massive improvements going from a Lighthouse score of 20 to a Lighthouse score of 60 in the last 2 months. These folks are seeing massive increases in search traffic. That’s not happened after BERT. That did not happen after the core update. That’s really happening right now, as of the last week. I feel like this is not a system-wide situation, but as you alluded to Ben, it’s a couple of corrections on the dial here and there, maybe due to BERT, maybe due to other things. They’re just seeing some anomalies or they’re coming back and they’re saying, “Hey, our core values are our performance. Our core values are our clean code. Our core values are a well-structured website. We’re going to correct and make sure that that’s still being adhered to.”

Ben:                 You mentioned that you saw a couple of industries that have been impacted, potentially related to BERT, potentially the three other updates that happened in like the previous, I don’t know, month or however long it was, the series of updates that happened between September and November. Who have you noticed getting impacted? You mentioned news and media. Anybody else getting helped or hurt by the recent changes?

Jordan:             Yeah, news and media being one of them. That’s been one that’s top of mind a lot. The other one is the travel category, which we look at as the OTAs, so the online travel agencies. These are websites like Expedia, Orbitz. We see like a system-wide like decrease across many of those sites over the last three or four weeks. It’s not just something that’s happened in the last week or two, but like we see a decline over the last month from many of these websites and it’s pretty much across the board. Almost everybody has seen some softness in that category. We’re also seeing some corrections here in the eCommerce space, so quite a bit of fluctuation taking place within the traditional eCommerce sites, eBay, Amazon, the Wayfairs all seeing a little bit of volatility over the last three to four weeks. The good news is that some of the more performant ones in the last week or two have seen a nice spike back up, 10% in visibility jump back up. That’s a promising sign if you have been investing in performance and you’re in eCommerce, that you can regain that traction.

Ben:                 Okay. We’re seeing a shakeup after the series of updates, some related to BERT, some previously not, a couple of industries that have really been impacted, and now we’re seeing Google go through and do some cleanup. That’s story number one, what’s happening during update season. Jordan, give me your second piece of SEO breaking news.

Jordan:             Yeah, so another piece of news that came out earlier this week, it just so happens to be on this same topic of speed, is that Google released some really useful, in my opinion, tools and resources around speed. They basically provided a really clear way of looking at Chrome and Chrome experience and the speed at which sites operate in the environment. They have created like a really neat audit-like experience that gives you a snapshot view of your overall Lighthouse performance score, accessibility, best practices, SEO score, tallies these up and gives you kind of like a breakdown of the actual areas impacted and what you need to clean up.

Ben:                 We’ve talked a lot about site speed and how important it is. As much as we might harp on Google and give them a hard time about not being transparent with some of the changes happening on the update side, they have made it very clear that site speed is a major priority and a ranking factor, even though Google doesn’t necessarily say things are ranking factors. They’re actually providing a tool here to help the community understand how Google is evaluating site speed. It won’t necessarily show how speed impacts their SEO performance, but it gives them some sort of benchmark to understand what is impacting their site speed.

Jordan:             Exactly. I mean, ultimately, like this resource is phenomenal. I mean, they’ve provided not just a measurement tool, like a real … very similar to the Lighthouse developer tool that existed in your Chrome plugin, this is a very accessible tool. For some of you who’ve been in this space for quite a while, you might remember the old PageSpeed Insights tool. This is kind of a mashup of the two of them. It’s a really easy, simple, quick way to understand what’s working and not working from a speed standpoint. Then it’s backed up by real learning materials so you can actually go in and learn about, hey, why am I having issues with my JavaScript? They have some manuals on those topics.

Ben:                 Jordan, tell me a little bit about how people can get access to the tool. Is it a plugin? Is it a website? Is it in webmaster tools? Webmaster tools … Is it in search console?

Jordan:             We may want to keep that, right? Yeah, go old school with webmaster tools. No, it’s very easy to access. It’s under the web.dev, D-E-V, domain, so web.dev. You go from there and you can quickly test your website or test a specific URL. You can find the blog and learning material. It’s like I mentioned earlier, it’s very similar to that PageSpeed tool and it just kind of throws everything up there for you and makes it super easy to access.

Ben:                 Jordan, tell me about the third breaking news story. What else is happening in the SEO world?

Jordan:             Yeah, so I think this is the interesting part of the episode here. These are great updates for everybody and I hope that you will go and take a look at some of the news out there on the recent Google fluctuations, but then also this speed tool, but one of the things that we’ve noticed here at Searchmetrics is this massive acceleration in YouTube. We actually wrote about it a few weeks ago, maybe like three or four weeks ago, when YouTube had almost a 15% jump in visibility and actually surpassed, may I say it, Wikipedia, in terms of overall visibility.

Ben:                 You can’t say Wikipedia, can you?

Jordan:             No.

Ben:                 Give me three Wikipedias.

Jordan:             It’s definitely a tongue twister, but here’s the real sad part about this is I’ve been, for nearly a decade, saying that Wikipedia is the biggest website in search and now I can’t say that and it just kind of kills me.

Ben:                 Because you literally can’t say it. It’s Wikipedia.

Jordan:             Wikipedia.

Ben:                 There we go.

Jordan:             Wikipedia is number two. Number two is not as cool as number one. YouTube has been fluctuating with Wikipedia for the last couple of weeks. Actually they just took over number one again. They were fluctuating in between one and two for a while. The crazy thing about this is that the biggest driver to this is twofold. The first one is that-

Ben:                 Google owns YouTube.

Jordan:             Well, there’s that.

Ben:                 Oh, sorry, are we not supposed to say that?

Jordan:             No, I mean, it’s pretty clear as day, right? This is actually the real interesting part. Even though Google owns them, it doesn’t necessarily mean that they’re always the winner. I’m curious to see how long YouTube stays in this position and owns this much market share in search, but the reality is that this is predominantly driven by YouTube’s own changes. They made core infrastructure improvements to the site. They expanded the amount of available videos on their homepage. They expanded the amount of links that can connect to content throughout their site. In so doing, it’s improved the crawlability and prioritization of videos and thus impacted their search results. I mean, I hate to say this, but Google’s eating their own dog food.

Ben:                 I think the secret sauce here is step number one, build a platform in a medium where it hasn’t been dominated online. Video, audio, we’re kind of running out of those opportunities, right?

Jordan:             Sure.

Ben:                 Step number two, get acquired by Google. Step number three, do a lot of really smart SEO stuff and refer to step number two.

Jordan:             Well, the crazy thing is that step number three is often forgotten by many things that have been acquired by step number two, which is acquired by eBay, or by Google, excuse me.

Ben:                 You mentioned that YouTube made some changes to their homepage. The thing that I saw that they changed on their homepage is that they actually made the videos at the top of the page, above the fold, a little larger. They’re basically giving more space to the featured videos. A lot of that’s breaking news type stuff. What else do you see that has been changed on the site that’s impacting search?

Jordan:             There are essentially three factors here. YouTube’s speed has always been top-notch, but they’ve continued to expand their capabilities here. Secondly, they’ve really expanded the amount of content that’s on their main home pages and category pages. You go to, say, the gaming category page, and the amount of available options on there using their carousel experience has quadrupled. Before there may have been 40, 50 videos. Now we’re looking at nearly 100. This is a remarkable explosion in the volume of already prioritized videos for Google to consume. I think that’s the important piece there, already prioritized, because there’s so much UGC being pushed into YouTube. YouTube’s main job is to prioritize what should get into Google, because not everything should get into Google and not everything should be prioritized from a linking standpoint. That is what they’ve done very well.

Ben:                 I think the last thing I want to bring up is two very important numbers in the SEO community. One of them is 86 and the other one is 118. Jordan, can you tell everyone why those numbers are relevant to the SEO community?

Jordan:             It’s not Lighthouse scores, is it?

Ben:                 It is not Lighthouse scores, Jordan.

Jordan:             Because that maxes out at 100. Ben and I have this ongoing rivalry here. It’s called fantasy football. That would be this week’s score between Ben and myself, which is a massive disappointment because my team is superior to his in every aspect, in every which way.

Ben:                 Except for points scored and position on the scoreboard. First place, baby.

Jordan:             If there’s any consolation to this problem, it’s that my bench scored enough points to beat him, but that’s an unfortunate circumstance for the manager, as he did not play those people.

Ben:                 Well, Jordan, if anything, we want everybody listening to this podcast to know that we are absolutely winging it today. We’re not going to hide the fact. This was not a planned episode, but we found a couple of topics we wanted to talk to you about. If you think that these breaking news episodes are interesting and you want more real time information on what’s happening in the SEO community, shoot us a tweet. I know we read this in our outro for every episode, but we would love to hear from you if this format of content is interesting. Send us a message. Jordan’s handle is JTKoene. Mine is BenJShap. We actually created a new Voices of Search podcast Twitter handle. It is Voices of Search, @VoicesOfSearch on Twitter. If you’re interested and you like this episode, shoot us a note, let us know you’re listening, let us know how you feel, and tell Jordan he sucks at fantasy football.

Jordan:             I knew that was coming.

Ben:                 All right. On that note, that wraps up this episode of the Voices of Search podcast. Thanks for listening to my conversation with second place in the TenderNob fantasy football league, oh, and the lead strategist and CEO of Searchmetrics Inc, Jordan Koene. We’d love to continue this conversation with you as I mentioned, so shoot us a tweet. Jordan’s handle is JTKoene, J-T-K-O-E-N-E. Mine is BenJShap, B-E-N-J-S-H-A-P. The show’s new Twitter handle is VoicesofSearch. If you’re interested in learning about how to use search data to boost your organic traffic, online visibility or to gain competitive insights, head over to Searchmetrics.com/freetrial for a free trial of our SEO suite and content experience software. If you liked this podcast and you want a regular stream of SEO and content marketing insights in your podcast feed, hit the subscribe button in your podcast app and we’ll be back in your feed next week. All right. That’s it for today, but until next time, remember, the answers are always in the data.


Google News Optimization: 11 common mistakes and how to avoid them

News articles from publishers draw traffic from a variety of channels, most notably Google web search and Google Discover. In this article, we’ll explain what’s important for your Google News Box & Discover optimization and what mistakes you should avoid.

If you’re a publisher looking to learn how to optimize your site for Google News and organic search, then our SEO and Content Consultants can give you practical advice on how to improve your reach and your traffic. Curious? Ask for more publishing success now:

Request an appointment!

Overview: Traffic sources for publishers

The table shows the different traffic channels for article pages of news publishers. It also shows whether the channel drives traffic to an HTML or an AMP page, and how high the traffic potential is for each channel.

| Traffic source | Lands on HTML | Lands on AMP | Traffic potential |
|---|---|---|---|
| Google Web Search, Desktop, News Box (Top Stories) | x | never | very high |
| Google Web Search, Desktop, Organic Rankings | x | never | high |
| Google Web Search, Mobile, News Carousel | very rare | x | high |
| Google Web Search, Mobile, Publisher Carousel | x | x | medium |
| Google Web Search, Mobile, Organic Rankings | x | x | high |
| Google News Vertical (Tab in Web Search), Desktop | x | never | low |
| Google News Vertical (Tab in Web Search), Mobile | x | never | low |
| news.google.com, Desktop | x | never | low |
| news.google.com, Mobile | x | x | low |
| Google Discover | x | x | high |

The potential is highest on Google Web Search and Google Discover. That’s why this article runs through 11 errors that are common for these channels:

Mistake #1: No department or section pages
Mistake #2: Technical issues in news article template
Mistake #3: Same article on different URLs (duplicate content)
Mistake #4: Anchor Text on Category Page ≠ Heading or Page Title
Mistake #5: Image links or JavaScript links on category pages
Mistake #6: No XML News Sitemap
Mistake #7: No Accelerated Mobile Pages (AMP)
Mistake #8: Underestimate Google Discover or focus purely on news
Mistake #9: Value proposition of the company is insufficient
Mistake #10: No SEO in the editorial office / no SEO editor in the newsroom
Mistake #11: SEO not integrated in the company / too few IT resources

Mistake #1: No department or section pages

Theoretically, a publisher can rank with their news articles without being included in the Google Publisher Center. In practice, however, they have to register in the Publisher Center and arrange the individual section pages in so-called labels. This is the only way to ensure that Google can crawl the news pages classified in their sections. Important: the URLs of the section pages must not change. If they change, Google will no longer find a valid URL under the label, which can prevent the news articles from ranking.

Google Publisher Center: Labeling of category pages

Mistake #2: Technical issues in news article template

News articles can only rank if the Google crawler can easily read the layout and format of the news page. The page layout has the following requirements:

  1. The article pages must be in HTML format and the body must not be embedded in JavaScript.
  2. The headings of the articles and the time of their publication must be easily recognized by the Google crawler.
  3. Articles must have a unique and visible date and time, ideally found between the heading and article text.

In addition, you should note the following things:

  • Make the template as light as possible.
  • Use hierarchical headers (h-tags).
  • Include as few distractions (or ads) as possible within the text.
  • Use structured data, e.g. “datePublished” and/or “dateModified” with the correct time zone (see Guidelines for AMP Pages and Guidelines for Non-AMP Pages); a minimal example follows below.
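As an illustration, here is a minimal sketch of article structured data carrying “datePublished” and “dateModified” in ISO 8601 format with an explicit time zone offset. The headline, dates and author are placeholders, not values from any real article:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example headline",
  "datePublished": "2020-01-15T08:30:00+01:00",
  "dateModified": "2020-01-15T10:05:00+01:00",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```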

Mistake #3: Same article on different URLs (duplicate content)

Have you got one article on two URLs? Google says, “If you’ve already published your article at www.example.com/news_1.html, do not re-post it later at www.example.com/news_2.html.”

With this in mind, news sites should review how they handle agency reports. Under no circumstances should you republish agency reports without adaptation (auto-publishing). Otherwise, the content you’re publishing will simultaneously appear on many other domains (external duplicate content). It is better to publish the reports with a revised title, headline and lead. And it is best to completely rewrite agency content.

Have you got two articles on one URL? Google says: “We cannot crawl the page www.yoursite.com/messages1.html if it shows a different report every day. Our links to articles only work correctly if each article on a news site is associated with a unique URL. This URL must be permanent.”

Mistake #4: Anchor Text on Category Page ≠ Heading or Page Title

The anchor text that references the article from the category page must match the title of the article and the page title.
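For example (URLs and wording are hypothetical), the link on the section page, the article’s page title and its main heading should all carry the same wording:

```html
<!-- On the section page -->
<a href="/politics/example-article.html">Parliament passes new climate law</a>

<!-- On the article page -->
<title>Parliament passes new climate law</title>
<h1>Parliament passes new climate law</h1>
```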

Mistake #5: Image links or JavaScript links on category pages

Image links or JavaScript-embedded links cannot be crawled. Therefore, make sure that all articles on your section pages are referenced only through HTML links.
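A minimal sketch of the difference, with a placeholder URL:

```html
<!-- Crawlable: a plain HTML link -->
<a href="/news/example-article.html">Example article</a>

<!-- Not crawlable as a link: navigation happens only via JavaScript -->
<span onclick="location.href='/news/example-article.html'">Example article</span>
```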

Mistake #6: No XML News Sitemap

With an XML News sitemap, you determine which content is transmitted to the various Google News channels. The XML sitemap is a must-have in news optimization. It increases the likelihood that your articles will be found in a short time and displayed in the Google universe.

Use tags so that the sitemap provides as much relevant information as possible:

  • the news-specific tags: publication (name, language), publication_date, title
  • more tags like rel alternate, lastmod and image (loc & caption & title)

In addition to news articles, Google also publishes video pages for news keywords. Video tags in the XML News sitemap increase the likelihood that news sites will appear in the video integration on the SERP.
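For orientation, here is a minimal sketch of a news sitemap entry using the news-specific tags mentioned above (the URL, publication name and values are placeholders; the optional image and video tags are omitted):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.example.com/news/example-article.html</loc>
    <news:news>
      <news:publication>
        <news:name>Example Times</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2020-01-15T08:30:00+01:00</news:publication_date>
      <news:title>Parliament passes new climate law</news:title>
    </news:news>
  </url>
</urlset>
```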

Mistake #7: No Accelerated Mobile Pages (AMP)

Without Accelerated Mobile Pages, domains are missing out on a huge amount of potential. News traffic is predominantly mobile. Many commuters consume news on their smartphone. But only Accelerated Mobile Pages (AMP) will be included by Google in the mobile news carousel.

Mobile News Carousel – with only AMP results

Many news portals with AMP are already getting more traffic through their AMP than through their HTML pages. It can be assumed that AMP traffic will continue to increase in the future. Once AMP is in place, the step to building a Progressive Web App (PWA) is easy.

Other advantages:

  • more reach, clicks and impressions
  • low data volume
  • shortened load times
  • lower bounce rates
  • higher user satisfaction

Many publishers have been successful with AMP.
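For orientation, here is a stripped-down sketch of the skeleton every AMP document must follow. The mandatory amp-boilerplate CSS is abbreviated to a comment; the full required snippet is in the AMP documentation, and the URL is a placeholder:

```html
<!doctype html>
<html amp lang="en">
<head>
  <meta charset="utf-8">
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <title>Example article</title>
  <link rel="canonical" href="https://www.example.com/news/example-article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <style amp-boilerplate>/* mandatory AMP boilerplate CSS goes here */</style>
</head>
<body>
  <h1>Example article</h1>
</body>
</html>
```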

With Google Discover, publishers can currently (still) be listed without AMP.

Mistake #8: Underestimate Google Discover or focus purely on news

Google continues to evolve the user experience. Google Discover, activity cards, and other innovations show this profound evolution. “Discover is special because it’s always a step ahead: it helps you find things you haven’t even searched for,” it says on Google’s blog. As early as 2018, more than 800 million people were using the feed to keep themselves informed and up-to-date.

Discover shows logged-in users current (audio-)visual content, articles, or other information tailored to search history and user interests.

“When you plan your next trip, you can view travel destination articles and restaurant tips. (…) Using the topics in the Knowledge Graph, Discover can gauge how familiar you are with a topic and help you explore it further.”

This will give guitar beginners tips on learning to play chords, while experienced musicians will get advanced picking techniques. While “normal sites” have no chance of being listed in Google Discover, publishers currently get 30% to 60% of their organic traffic from this source.

Google Discover places particular emphasis on evergreen content: topics that are always relevant and up-to-date. News sites should therefore increasingly publish content that isn’t time-sensitive. Evergreen content is also sustainable from the point of view of search engine optimization (SEO): it ensures constant traffic with little effort. In addition, this traffic lasts significantly longer than typical news traffic. Of course, you should not neglect optimization for the Google News box either.

Mistake #9: Value proposition of the company is insufficient

Many news sites stray too far from their established target group. Long-time users trust the publisher and know its credibility. Things are different for new users who come from search engines. They are often not sufficiently welcomed by the website.

News portals must give these users a clear value proposition and create trust. What does the company offer, and are its products presented as compellingly as possible? The answers to these questions are usually not communicated clearly enough.

Often missing:

  • a good presentation of the editorial team,
  • a reference to serious editorial guidelines,
  • the goals and history of the company.

To create authority in a specific subject area, you must also focus on your core competencies. What sense does it make, for example, for a politically-engaged daily newspaper to link a sports section in the main navigation, even though 95% of its articles deal with the topics of politics, feminism, environmental protection and right-wing extremism?

Mistake #10: No SEO in the editorial office/no SEO editor in the newsroom

Many news sites shy away from integrating SEO into their editorial work. Instead of hiring an expert who supports the editors every day in the newsroom, an attempt is made to cover SEO with inexperienced employees. The publisher’s management does not recognize how much know-how is needed for successful news SEO.

If a publisher does have an SEO in the editorial office, they will see the following benefits:

  • Setup of web analytics for measuring SEO vs other traffic channels
  • Editorial support for content planning
  • SEO quality management by SEO editor before publishing
  • Monitoring the news rankings (live monitoring): for which keywords are news boxes appearing where my publisher has no content?

Mistake #11: SEO not integrated in the company/too few IT resources

Search engine optimization is only scalable when processes are redesigned in the editorial office. At the Searchmetrics Digital Strategies Group, our advice helps clients help themselves. We support news sites on their personal “SEO journey”. This support is important at the beginning to ensure that publishers do not set off in the wrong direction.

For example, we recommend that you implement SEO processes as sustainably as possible and not view them as foreign objects in the company that can be outsourced. The organizational challenges associated with doing this well should not be underestimated.

Sufficient SEO know-how and sufficient IT resources are core competencies of news portals that will, in the future, decide whether a publisher succeeds or fails.

Do you need support with becoming more successful in digital publishing?

Request an appointment!


JavaScript Crawling for Better & More Accurate Site Audits

There are over 1.5 billion websites in the world, and JavaScript is used on 95% of them.

With no additional introduction needed on the importance of performing an in-depth SEO analysis on your site, regardless of the programming language it uses, we are proud to tell you that, starting today, cognitiveSEO can crawl and perform in-depth site audits on all types of websites, JavaScript-generated sites included.

JavaScript Crawling for Accurate Site Audits

With so many sites using content generated with JavaScript, you might feel that there is no need to highlight the importance of JS. Yet, with so many changes happening on the world wide web, it is safe to assume that one might not be up to date with everything that is happening in web development. However, using JavaScript heavily impacts a business and its SEO strategy; therefore, being able to run site audits on all types of sites is a must these days.

So, indulge us as we talk a bit about the importance of JavaScript crawling, about how to run a site audit on JS-generated sites within cognitiveSEO, and about how JavaScript affects SEO.

Find All Possible SEO Issues on JavaScript Websites

As we mentioned before, cognitiveSEO has just added a brand new feature: the possibility of crawling and analyzing JavaScript websites.

You can find all the possible technical and SEO issues a site may have and get recommendations on how to fix them.

cognitiveSEO’s Site Audit analyzes the technical health of your JavaScript website and helps you detect all the weak points of your website before your users do, giving you an advantage in the competitive market we are all swimming in. The SEO Audit Tool crawls all the pages it finds on a website, be it JavaScript-based or not, regardless of the size of your website, and provides a fully customized set of data that is easy to comprehend and visualize.

cognitiveSEO site audit

The truth is that with the ever-evolving search engine algorithms, you need an efficient solution to keep your rankings safe. And cognitiveSEO does exactly that: it lets you know all the issues that might prevent your online business from getting the organic traffic and the high rankings you deserve.

You can set JavaScript crawling for any site with literally one click: simply enable the JavaScript crawling functionality from the editing section and you’re all set. 

JavaScript crawling in the cognitiveSEO site audit

Of course, along with this functionality there are dozens of other customizations you can add to your site audit so it can best respond to your needs: crawl structure, when to crawl, what to crawl, crawl sources, etc. Anything you need to perform a site audit as accurate and reliable as possible. 


Performing a complete JavaScript website audit will give you a deeper understanding of why your site is not generating the organic traffic you think it should, or why your sales and conversions are not improving. This kind of website audit gives you a much wider array of SEO items to look at and can analyze issues of all types that might prevent you from reaching your best possible ranking.

Top issues

Why Is JavaScript Crawling So Important?

There are tons of websites using JS and it’s easy to understand why this happens. Using JavaScript in designing and building websites is very popular as it makes it possible to create interactive and dynamic content. Visual animations, navigation, displaying dynamic content to users based on specific behaviors are just a few of the things that can be done with JavaScript.

Yet, analyzing a JavaScript website is not always a walk in the park. It requires a lot of resources and in-depth analysis. Luckily, you don’t need to worry about this anymore, as you can run complete Site Audits within cognitiveSEO on all types of websites.

Here are just some of the advantages of enabling JavaScript crawling when running a Site Audit: 

You’ll get a more accurate analysis and better issue identification.

Many sites use JavaScript generated content, be it completely or partially. Even if your site has only a couple of elements generated with JS, it is highly important to get those sections analyzed as well. It’s always the small pieces that make the bigger picture. 

You can find out how crawlable your pages are with and without JavaScript rendering enabled.

JavaScript analysis is the closest thing you get to what your users actually see within their browsers. You can run an SEO audit, with and without JS enabled and see the differences for yourself. 
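To see the difference for yourself outside the tool, here is a minimal sketch in Node.js that fetches a page’s raw HTML and its JavaScript-rendered HTML for comparison. It assumes the puppeteer and node-fetch packages are installed; the URL is a placeholder:

```javascript
const puppeteer = require('puppeteer');
const fetch = require('node-fetch');

async function compare(url) {
  // Raw HTML: what a non-rendering crawler sees.
  const raw = await (await fetch(url)).text();

  // Rendered HTML: the DOM after JavaScript has run.
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' });
  const rendered = await page.content();
  await browser.close();

  console.log(`Raw HTML: ${raw.length} chars; rendered HTML: ${rendered.length} chars`);
}

compare('https://www.example.com');
```

A large gap between the two sizes is a strong hint that important content only exists after rendering.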

Before and after JavaScript crawling in cognitiveSEO

You can analyze a wider range of websites.

So, let’s say that you want a really cool site template with a dandy interface, so you choose to create your site using Wix, for instance. That’s awesome! The downside is that, in most instances, you’ll struggle to fully analyze all the technical and SEO issues your site might have. Yet JavaScript crawling gets this done for you, allowing you to analyze sites built with different platforms and programming languages.

What Is JavaScript and How Does it Work?

JavaScript is one of the most popular programming languages for developing websites. Often with the help of frameworks, it creates interactive web pages by controlling the behavior of different elements on the page.

Initially, JavaScript ran client-side (front-end) only, in browsers, but the code can now also be embedded in other host software, such as web servers and databases (server-side, or back-end), which saves you a lot of trouble and pain. The problems started when JavaScript implementations relied only on client-side rendering.

If JavaScript frameworks have server-side rendering, you’ve already solved the problem before it even arises. To better understand exactly why problems appear and how you can avoid them, it is important to have some basic knowledge on how search engines work. For that, we need to establish the phases of the information retrieval process: crawling, indexing and ranking.

How JavaScript Affects SEO

If you want to find out more about the impact of JavaScript on SEO, you need to know that we wrote a more exhaustive piece of content on JavaScript and SEO.

JavaScript means faster loading and lower server load (code runs immediately in the browser instead of waiting for the server to answer), easier implementation, richer interfaces and higher versatility (it can be used in a huge variety of applications). But JavaScript SEO brings some problems along the way. Lots of webmasters fail to optimize content that uses JavaScript code. And this is (most likely) because they don’t have a complete site audit overview.

Everybody started to doubt whether search engines like Google are able to crawl JavaScript. And that was the wrong question to ask. The better question is: can search engines parse and understand the content rendered by JavaScript? In other words, can Google rank your website if it’s made in JavaScript? The truth is that more and more websites are relying on JavaScript, so search engines have improved their page rendering capabilities.

Even if perhaps not completely, it’s important to know that Google is able to crawl and index the rendered version of pages as well as the plain HTML version.

So, the questions that pop up are:

  • “Does Google crawl JavaScript?” The answer is: more and more lately.
  • “Does Google index JavaScript?” The answer is yes.
  • “Should I use JavaScript?” The answer is: it depends on your needs.

A JS website is indexed and ranked. We’ve learned things the hard way until now. We know that making it easier for Google to understand the generated content is the best approach. To help Google rank content that uses JavaScript, you need tools and plugins to make it SEO friendly. And you need a comprehensive SEO analysis to make sure your site is performing as well as it can. 

It’s not all sunshine and roses, we know. You need to know that JavaScript uses up crawl budget as Google requires more resources to render, crawl and index JS websites, adding a layer of complexity to the process. Moreover, JavaScript can be hard to combine with good SEO.

Yet, regardless of all these matters, keep in mind that a good SEO Audit can save you from lots of bigger troubles; and since you now have the possibility of performing audits on all types of sites, JS included, you have no excuses for not performing a Site Audit right now. 

Start Your Free 7-Day Trial


Going international: How to make your WordPress site globally friendly

International expansion is a natural ambition for growing WordPress sites and similar platforms. The online nature of this global reach means that the uncertainties, legal dangers, and cultural hazards are minimized. The world is at your fingertips, and the costs of reaching it successfully are minimal.

The rationale for reaching out to a new audience, readership, viewership or listenership may be one of opportunity, exciting new prospects, or high growth potential, or a wish to escape a domestic market that has become too saturated or competitive.

With only some limitations, the internet is a global phenomenon that effectively ties us all together with invisible strings. Send a Tweet from Prague and reply to it in Illinois. Publish an eBook in Seattle and share it with your friends in Beirut. There are practically no boundaries when it comes to sharing content online.

When it comes to your WordPress website, the one you’ve dedicated time, money and energy building, I expect that you will want it to possess the maximum global reach possible. This doesn’t just happen by chance and requires some key features within your site to make this happen. The following tips and suggested plugins should set you and your WordPress site on the path to international influence.

Four tips to help make your WordPress site globally friendly

1. Globalize your content

The foundation of an internationally appealing website is its content transcreation. This does not focus on the mere translation of words but ensures the recreation of meaning, intent, and context.

It is important to make sure that the meaning of the content does not change when translated into another language, and that it does not convey your message wrongly. Cultural hazards are rife when it comes to international expansion of any kind. To be accepted and welcomed in a different geographical area, you cannot afford to display misunderstood and potentially offensive content.

Unsurprisingly, over 73% of the global market prefers websites with content in their native language. If people cannot understand the content on your website, you cannot hope to keep their interest. In the same vein, inaccurate translations just won’t cut it. The best option is to find a content writer who can craft the copy in a specific language for better quality content.

2. Avoid rigid localized options

Some websites choose the default website domain and language based on dynamic Geolocation IP tracking. Others do not have rigid local settings and allow their websites to be accessed by users from anywhere. If you are hoping to reach as many readers as possible, this option is best. No matter the country from which your website is browsed, it can be accessed without limitations of location.

3. Avoid using text on images

Google cannot translate text on images. This is the same for logos, headings, and other information. This can be majorly off-putting for readers who do not understand some parts of your website. Further, no translator or software that runs on your multilingual site can translate graphical text. Therefore, avoid it altogether for the best results, or keep it to a minimum for a more international audience.

4. Localize checkout and shipping factors

Whether your WordPress site is an online store or sells software as a service that doesn’t require any shipping at all, your checkout process should be appropriately localized. Currency options are fundamental to users taking that final step to make the purchase. There are WordPress plugins available to allow for multiple currencies to be displayed and chosen from.

If you are giving the option of international shipping, then inform buyers beforehand whether or not the product is available for shipping to their local address. Make the option to convert the currency clear, and choose a suitable API tool for currency conversions. To keep abandoned-cart figures in check, allow the user to view the delivery charges and taxes prior to checking out. Finally, remember that people from different locations are more comfortable with different payment methods, so be sure to provide multiple options.
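On the display side, a small sketch: the browser’s built-in Intl.NumberFormat API can format a converted price in the shopper’s local currency. The exchange rate below is a hypothetical placeholder; in practice it would come from whichever currency API you choose:

```javascript
// Format a base price (in USD) for display in a shopper's local currency.
// The rate is a placeholder; fetch a live rate from your currency API.
function localizePrice(amountUsd, rate, currency, locale) {
  const converted = amountUsd * rate;
  return new Intl.NumberFormat(locale, { style: 'currency', currency }).format(converted);
}

console.log(localizePrice(49.99, 0.91, 'EUR', 'de-DE')); // e.g. "45,49 €"
```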

Plugins to help make your WordPress site globally friendly

1. TranslatePress

This full-fledged WordPress multilingual plugin translates every aspect of your website. Its main feature is that it allows you to translate directly from the front end. It allows you to easily switch languages during translation, and the live preview is updated instantly. All translations of content, theme, plugins and even metadata can be made without changing the interface.

It is ideal for manual translations. Do it yourself or assign a custom translator ‘user role’ to any user on your site. Users will then be able to translate as and when they want, without needing access to the admin area.

Lastly, the plugin creates SEO-friendly URLs for all languages and boosts you up the local SEO results. Ranking well will make the extra effort of globalizing your site worth your while. Once you have established yourself as an authoritative and respectably ranking website abroad, you’re in and can continue the normal operation of your site.

2. Multi-currency for WooCommerce

As discussed, the need for multiple currencies on your international online store is unchallenged. This plugin allows users to easily switch between currencies and make use of a currency exchange rate converter with no limits. It can be used to accept only one currency or all currencies. Multi-currency for WooCommerce helps enhance your site’s user experience and will do so for free. It’s a no-brainer.

These tips and plugins will help you achieve your international SEO goals. Wish to add more tips and plugins to this list? Mention them in the comments.

The post Going international: How to make your WordPress site globally friendly appeared first on Search Engine Watch.


5 SEO copywriting mistakes you should avoid

Copywriting is a crucial element of every SEO strategy. High-quality content is what helps you rank and sets your site apart from all the other sites out there. So, you’ll want to do SEO copywriting the right way. Preferably without making any fatal mistakes. Want to know what not to do when you’re writing SEO copy? Let’s have a look at five mistakes to avoid!

Are you struggling with more aspects of SEO copywriting? Don’t worry! We can teach you to master all facets, so you’ll know how to write awesome copy that ranks. Take a look at our SEO copywriting training and try the free trial lessons!

1. Not starting with keyword research

SEO copywriting always starts with keyword research. Now, it may be tempting to skip this time-consuming task and just wing it, but that can seriously harm your chances to rank! You can write tons of awesome content, but if it’s not optimized to rank for realistic search terms that people actually use, it won’t help you one bit. So, always take some time to think about the terms you want to be found for. Take keyword research seriously, it’ll definitely pay off.

2. Forgetting about search intent

Before you start writing, as you’re doing your keyword research, it’s also crucial to take a good look at search intent. You need to have a clear idea of the kind of intent that’s behind your keywords. People could be looking for information, a specific website, or they might want to buy something. Why is search intent important for SEO copywriting? Well, for example, if you write sales-oriented, persuasive copy for a keyword that only has informational intent (rather than transactional), odds are you won’t rank. You simply won’t answer the searcher’s needs. Or, even if you do rank, visitors will likely leave instantly because they’re looking for another type of content. You don’t want that!

The solution: as you’re doing keyword research, analyze the SERPs, so you have an idea of the intent behind the keyphrases you’re targeting. Write your content accordingly, giving some thought to your text purpose, tone of voice, length and call to action, for example. 

3. Not using synonyms

Search engines these days are incredibly smart: they understand that some words can have the same (or similar) meanings. Use that to your advantage! Don’t stuff your text with contrived occurrences of your exact focus keyphrase. Instead, make sure you use synonyms of your keyphrase. Not only will that help avoid repetition (which makes texts boring and hard to read), but you’ll also increase your chances to rank for related keywords!

In short: don’t make the mistake of neglecting synonyms and related keywords or keyphrases. Did you know that Yoast SEO makes optimizing for synonyms and related keywords a lot easier? The Yoast Premium plugin finds your focus keyphrase, even if the words appear in a different order in the same sentence. In addition, it allows you to optimize your text for synonyms and related focus keyphrases. That makes the plugin (almost) as smart as Google. I guess the real mistake here is not getting Yoast SEO Premium 😉 .

4. Not thinking about the user’s perspective

Something that often goes wrong in SEO copywriting: content that’s solely written from the site owner’s view, without taking the user into account. No wonder, as it’s very hard to forget everything you know and put yourself in your user’s shoes. Nevertheless, it’s important to try! You’ll soon realize that your user won’t really care about your product-related jargon, or why you think your product is awesome, or your blog post interesting. They want to know what problem of theirs you will solve, or what they will get out of reading your blog post.

Here are two short examples to give you an idea of the difference:

  1. We released Yoast SEO 12.X! It’s full of bug fixes, so we really believe you should update. 
  2. The latest version of Yoast SEO, 12.X, is here for you! Time to update, so you’ll get the smoothest experience of the plugin.

Which of the two appeals most to you, as a user? It’s probably the second one, right? Think about that when you’re writing your own SEO copy!

Bottom line: don’t make the mistake of neglecting your visitor’s perspective in your SEO copy. That also includes writing too much ‘I’ or ‘we’ in your content. Make it about your user, not yourself!

Read more: Engaging your online audience: 8 practical tips »

5. Writing unreadable texts

A final SEO copywriting mistake that people often make is writing posts that are hard to understand. A huge missed opportunity! If you make sure it’s easy for everyone to understand the message of your text, you’re opening up your content to a wide audience. That’s why writing a clear and readable text is a considerable part of your SEO copywriting strategy. People should be able to understand what you want to tell them. If you create copy that’s easy to comprehend, people will be less inclined to leave your site. They might even want to read your next post.

We know writing is hard. But there are things everyone can pay attention to in order to write a nice, clear text. Don’t use too many long sentences. Avoid using many difficult words. If you write for more than one region, check that you didn’t make any confusing mixups. Check whether the structure of your text is clear. The Yoast SEO plugin helps you with the readability analysis, which includes the Flesch reading ease score. And there are other tools out there to help you write texts that are nice to read, such as Hemingway. So, there are no more excuses not to write a lovely, readable text!

Conclusion on SEO copywriting mistakes

SEO copywriting mistakes are made when people focus too little on the quality of their texts and skimp on preparation. But your rankings will soon pay the price… So make sure each text has an original idea and a well-thought-out story that factors in the site’s visitors. And your copy should be nice and easy to read. It’s a lot of work, but Yoast SEO can help you get on track! And, if your copywriting is starting to look good, you can check for other common SEO mistakes!

Keep reading: SEO copywriting: the ultimate guide »

The post 5 SEO copywriting mistakes you should avoid appeared first on Yoast.


The Definitive Guide To SEO In 2020

Last year I decided to make Featured Snippets a priority for us.

And it helped us go from a handful of Featured Snippet rankings to over 190.

Backlinko – Featured Snippet numbers

Here’s the step-by-step process that I used.


1. Find Featured Snippet opportunities

Your first step is to find:

Keywords that you already rank for.
AND
Keywords that have a Featured Snippet.

Why is it important to focus on keywords that you rank for already?

99.58% of all Featured Snippets are from pages that rank on the first page for that term.

So if you don’t already rank in the top 10, you have zero chance of ranking in the Featured Snippet spot.

How do you find Featured Snippet Opportunities?

Ahrefs “Organic Keywords” report.

It shows you keywords that you rank for… that also have a Featured Snippet:

Backlinko ranking keywords with featured snippet

3,117 keywords? Looks like I have some work to do 🙂


2. Add “Snippet Bait” to Your Page

“Snippet Bait” is a 40-60 word block of content specifically designed to rank in the Featured Snippet spot.

Why 40-60 words?

Well, SEMrush analyzed nearly 7 million Featured Snippets. And they found that most Featured Snippets are 40-60 words long.

Most featured snippets are 40 to 60 words long

For example:

I wrote short Snippet Bait definitions for every page of The SEO Marketing Hub.

Backlinko – SEO Hub: Content Gap Analysis

And these helped my content rank in the Featured Snippet spot for lots of definition keywords.

Featured Snippet for "content gap analysis"

HubSpot takes Snippet Bait to another level.

They add little boxes to their posts that actually look like Featured Snippets:

HubSpot featured snippet
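For a concrete picture, here is a minimal sketch of what such a box might look like in plain HTML (the class name and wording are hypothetical):

```html
<div class="featured-snippet-bait">
  <h2>What Is Snippet Bait?</h2>
  <p>Snippet Bait is a 40-60 word block of content written specifically
     to answer a search query, placed near the top of the page so that
     Google can lift it directly into the Featured Snippet spot.</p>
</div>
```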


3. Format your content for other types of Featured Snippets

Snippet Bait works best for so-called “Paragraph Snippets”, like this:

Google search – "YouTube description" featured snippet

Even though paragraph snippets make up 81.9% of all Featured Snippets…

Type of Featured Snippet

…they’re not the only type.

If you want to rank for List Snippets…

Use H2 or H3 subheaders for every item on your list.

If you want to rank for list snippets use H2 or H3 subheaders for every item on your list

Google will pull those subheaders from your content… and include them in the Featured Snippet:

Google will pull those subheaders from your content and include them in the featured snippet
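For example, a list post might be structured with one subheader per item, like this hypothetical sketch:

```html
<h1>How to Brew Pour-Over Coffee</h1>
<h2>Step #1: Grind the Beans</h2>
<h2>Step #2: Rinse the Filter</h2>
<h2>Step #3: Bloom the Grounds</h2>
<h2>Step #4: Pour in Stages</h2>
```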

If you want to rank in Table Snippets…

You need to create a table that Google can easily pull data from.

For example, the content from this Table Snippet…

Table snippet SERPs

…is pulled directly from a well-formatted table.

Table snippet source site
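In practice, that means a cleanly marked-up HTML table with a header row, like this sketch (the values are made up for illustration):

```html
<table>
  <thead>
    <tr><th>Roast</th><th>Caffeine per 8 oz</th></tr>
  </thead>
  <tbody>
    <tr><td>Light roast</td><td>95 mg</td></tr>
    <tr><td>Dark roast</td><td>85 mg</td></tr>
  </tbody>
</table>
```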

Which leads us to our next topic…


2019 Google core algorithm updates: Lessons and tips to future-proof your SEO

There’s nothing that beats that organic #1 position in Google’s SERPs when it comes to brand visibility, increase in traffic, trust factor boost, reduction in cost per lead, and so on.

Everyone who’s anyone in online business knows this, which is why the struggle to grab that marketer’s Holy Grail can look like a cut-throat business to many SEO novices.

However, even SEO pros get confused when Google throws a wrench into the intricate workings of the rankings machine. Google’s core algorithm updates can mess up even the best SEO strategies, especially if you react in a panic to a drop in the rankings.

Today, I’ll share with you the three things I’ve learned from 2019 Google algorithm updates that will help you future-proof your SEO. First, however, take a look at the hints that Google rolled out alongside those updates to see if you’re building your SEO strategy on a healthy foundation.

2019 Google core algorithm updates and what they tell us

In 2018, Google reported 3,234 algorithm updates.

That’s just a bit shy of 9 updates per day.

All of them change how the algorithm evaluates a website and its rankings (most just slightly, though).

However, three of them were so-called ‘core algorithm updates’ – meaning that their impact on the rankings was likely significant for most indexed websites. Google announced these (in March, June, and September of 2019), which is not something that they normally do. This should give you an idea of how important they were in the grand scheme of all things SEO-related.

Google Sear Liaison's tweet on its 2019 Google core algorithm updates

Websites were affected differently, with some seeing increases in their rankings and traffic, and others plummeting to Google’s page #3. Many of the sites that experienced significant drops are in the Your Money, Your Life (YMYL) niche.

(Verywellhealth.com shows a significant drop after the March core update)

“The sensitive nature of the information on these types of websites can have a profound impact on peoples’ lives,” says Paul Teitelman of Paul Teitelman SEO Agency. “Google has long struggled with this and at least one of these core algorithm updates was designed to push trustworthy YMYL content to the top while sinking those websites that contain dubious and untrustworthy information.”

Google signaled a path forward with these updates. If you were not paying attention, here are the key takeaways:

  • Google signals an intent to keep rewarding fresh, complete, and unique content. Focus on answering the searcher’s questions thoroughly and precisely.
  • E-A-T (Expertise, Authoritativeness, Trustworthiness) guidelines are more important than ever. Things like backlinks from reputable websites, encryption, and who authors your posts can make or break your organic rankings.
  • Google wants to see you covering a wide range of topics from your broader niche. Increase your relevance with content that establishes you as the go-to source in your niche.

SEO is far from an exact science.

If anything, it’s educated guesswork based on countless hours of testing, tweaking, and then testing again.

Still, there are things that you can do to future-proof your SEO and protect your websites from violent ranking swings when core algorithm updates hit.

Based on Google’s recent hints, here are three things that you should focus on if you’re going after those page #1 rankings in the SERPs.

Three tips to future-proof your website’s SEO

Keep the focus on high-quality, actionable content

I know you’re tired of hearing it by now, but high-quality content is a prerequisite for ranking at the top of the SERPs and staying there.

This means that you need to pinpoint a specific question that the searcher wants answered, and then write a piece of content that addresses it in detail. Does it need to be 5,000 words long? That depends on the question but, in most cases, it doesn’t. What it needs to be is concise yet thorough, answering any and all questions that the searcher might have while reading it.

Ideally, you will want your content to be 1,500+ words. According to research by Backlinko’s Brian Dean, Google tends to reward longer content.

 

(Image source: https://backlinko.com/search-engine-ranking)

My advice is to ask yourself the following questions when you’re writing:

  • Am I providing the reader with a comprehensive answer to their question?
  • Is my content more thorough than what’s already on the #1 page of the SERPs?
  • Am I presenting the information in a trustworthy way (citing sources, quoting experts)?
  • Is my content easy to understand, and free from factual, stylistic, and grammar errors?

If your answer to all of these questions is yes, you’re already doing better than (probably) 95% of your competitors.

Improve the E-A-T score of your website

In SEO, E-A-T stands for Expertise, Authoritativeness, and Trustworthiness.

In other words – who is authoring blog posts and articles that are published on your website? Are they penned by an expert in the field or by a ghostwriter?

Why should people trust anything you (or your website) have to say? That’s the crux of E-A-T.

The concept appears in Google’s Quality Raters’ Guidelines (QRG), and SEO experts have debated for years whether or not it has any bearing on the actual organic rankings.

In 2018, Google cleared up those doubts, confirming that the QRG is, in fact, its blueprint for developing the search algorithm. “You can view the rater guidelines as where we want the search algorithm to go,” Ben Gomes, Google’s vice president of search, assistant and news, said in a CNBC interview.

Here’s what the QRG has to say about E-A-T:

(Image source: https://static.googleusercontent.com/media/guidelines.raterhub.com/en//searchqualityevaluatorguidelines.pdf)

We have no idea whether Google’s core algorithm can evaluate E-A-T parameters as well as an actual human rater. Still, if that’s Google’s end goal, it’s a good idea to pay attention to them now, regardless of how fully they’re implemented – they almost certainly will be at some point in the future.

To improve your E-A-T score, focus on the following:

  • Add an author byline to your posts – every post that you publish should be authored by someone. Use your real name (or your author’s real name), and start building a reputation as an expert in the field (see the markup sketch after this list for one way to make bylines machine-readable).
  • Create your personal website – even if you’re trying to rank your business site, make sure to have a personal branding website of your own (and of any regularly contributing authors). Those websites should be maintained – you don’t need to SEO the heck out of them but you should publish niche-relevant content regularly.
  • Get featured on Wikipedia and authority websites – QRG clearly instructs raters to check for author mentions on Wikipedia and other relevant sites. That stands to reason because experts in the field will often be quoted by other publications.

(Image source: https://static.googleusercontent.com/media/guidelines.raterhub.com/en//searchqualityevaluatorguidelines.pdf)

  • Get mentioned on forums – the same logic applies here. If people name-drop you on relevant forums, that means they feel you have something important to say.
  • Secure your site with HTTPS – security is an important E-A-T factor, especially if you’re selling something via your website. An unsecured website will have a low E-A-T score, so invest in encryption to boost trustworthiness.
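
On the byline point, one way to make authorship machine-readable – not something this post prescribes, but consistent with its advice – is schema.org Article markup. A minimal sketch, with a hypothetical headline, author name, and URL:

    <!-- Hypothetical example: swap in your own headline, name, and URL -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Your post title",
      "author": {
        "@type": "Person",
        "name": "Jane Doe",
        "url": "https://example.com/about-jane"
      }
    }
    </script>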

Build quality backlinks and establish a social presence

Quality backlinks are still a very important ranking factor.

However, according to a report released by Backlinko, it’s not about one or two backlinks, regardless of how strong they are.

What moves the ranking needle are sustainable, evergreen link-building strategies – backlinks from trusted, niche-related websites that are acquired by white hat SEO methods such as blogger outreach, guest posting, and collaborations with other influencers in the niche. The more of these types of backlinks you get, the better your organic rankings.

Additionally, getting backlinks from a greater number of referring domains ensures that your rankings are protected if, for example, a couple of those websites get shut down or penalized in the future. When you’re playing the link-building game, it pays to think ahead.

(Image Source: https://backlinko.com/google-ranking-factors)

And, while they don’t carry the same weight as true backlinks, you’d be wrong to underestimate the value Google’s ranking algorithm places on social media signals.

A truly authoritative website – and all the authors that write for it – will have a strong social media presence. They will use it to amplify their message, build additional authority, and drive traffic to their website. Ahrefs’ Tim Soulo does this better than any other SEO expert that I know.

(Image: an example of how a strong social media presence helps build authority)

All of this will feed back into the aforementioned E-A-T parameters. If nothing else, it will spread your name far and wide, signaling to Google that you’re not a complete nobody who just happens to run a website or write a blog about a certain topic. The stronger your social media presence – the more followers, comments, and shares you end up earning – the better it is for your E-A-T.

Get people to trust you and the algorithm will follow

Pretty soon, the key to top rankings will be how believable and trustworthy you are. Google’s current insistence on E-A-T parameters clearly demonstrates that. Everything else – the fancy schema you’re using, the on-page SEO gimmicks, and all the other loopholes SEO experts currently rely on to rank their websites – will just be icing on the cake.

I’m interested to hear what you think about the direction that Google is taking with this year’s algorithm updates. Have any of your websites been affected? Leave a comment below and let’s discuss.
