You & A with Matt Cutts at SMX Advanced 2014 (& Where is the Penguin Update?)

Each year at SMX Advanced, Danny does a You & A with Matt Cutts where members of the public submit questions, Danny asks them (reworded to be more relevant to a wide audience anyway) and Matt answers them. I live tweeted this year's session, and this post is an expansion of those tweets with everything I personally wanted to add. ;-)

Danny and Matt at SMX Advanced 2014. Danny and Matt pre-chat – photo credit Michelle Robbins. Ha.

The discussion began with Matt throwing stuffed hummingbirds and other objects from the stage. Danny asked if Matt had any announcements (he had a lot of them) and said we were going into a "question, then announcement" format for the chat.

I used my own hashtag for my live tweets (#SMXYA for SMX You & A), so you can see the complete list there. I've pulled selected tweets about the topics discussed into this post and expanded on them below.


Matt says the second part of the last iteration of the payday loan update will be launched later this week, maybe tomorrow. #smx #smxya

– Rae Hoffman (@sugarrae) June 12, 2014

The first announcement Matt made was that the latest payday loan update (referred to as Payday Loan 2.0) was actually a two-part update, and only the first part had launched earlier this month. He said he expected "Part B" to launch soon – "probably later this week," maybe even "tomorrow."

Danny then asked how Part B of Payday Loan 2.0 differed from Part A. Matt said Part B would focus more on "spammy queries" versus "spammy sites."

Now, supposedly the Payday Loan algorithm has always focused on "spammy queries," but it's possible Google was previously punishing sites on a "site level" while now they may be punishing on a "query level" – but that's only speculation on my part. Matt went no further than the above on the subject.

UPDATED TO ADD: Matt tweeted on 6/12 that Payday Loan 2.0 "Part B" had started rolling out:

@BtotheMcG it starts now

– Matt Cutts (@mattcutts) June 12, 2014

However, if that's the case, Barry Schwartz says the webmaster community does not seem to be noticing it.


Matt says MetaFilter was not affected by Panda

– Rae Hoffman (@sugarrae) June 12, 2014

Danny then asked Matt what happened with MetaFilter. Matt said unequivocally that MetaFilter was not hit by Panda. Matt said MetaFilter is a classic high-quality site, although he noted it was a high-quality site with an outdated design/user interface.

He then reiterated that not only was MetaFilter not affected by Panda, it was not affected by Penguin either. He added, "there are many different algorithms that we are launching." He mentioned that when MetaFilter published a post about their lost traffic, one of the things they suspected was that Google might have considered them spam, based on an email Google had sent.

Matt said they had "checked their records" and that, in fact, they had never cited MetaFilter as a spammy link to anyone – no matter what had been assumed. Google had not inserted the MetaFilter link into that email itself.

Matt seemed to suggest that MetaFilter did not get any manual action for their traffic loss, and that Google was trying to figure out what went wrong in the first place and planned to fix it algorithmically.

I took two things away from this discussion. The first was that they are able to "check their records" to find out whether they have ever cited a site like MetaFilter as a bad link.

The second was that – according to the MetaFilter post, their traffic losses coincided with Panda updates (the chart MetaFilter shared makes it too difficult to see the exact date of their biggest hit, and they never gave the date in the post to confirm Panda from the outside – but two Panda refreshes were launched in November 2012) – yet Matt said they were never touched by Panda, but were hit by a different algorithm.

So Google launches unannounced algorithms or updates at the same time as Panda refreshes? If so, does this mean some sites that think they were affected by Panda were not? We know Panda is one of the algos people most successfully recover from – if other updates are released at the same time, it could mean some of us are barking up the wrong tree when "fixing" certain sites.


Matt tells of a new system: every time a reconsideration request is rejected, the reviewer now has the option to add a note (on a first reconsideration request) #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014


Matt then went into how Google is trying to handle reconsideration requests for manual penalties a little better. On a first reconsideration request, it would appear the process was that a site was either refused or had the manual removed – the reviewer apparently could not add notes. Only when multiple reconsideration requests were made for the same site did reviewers have the opportunity to start communicating with the webmaster.

Matt added that the reviewer now has the option to add a note on a first reconsideration request.

Later in the session, Matt also admitted that they know they need to do a better job of communicating with the small business owner.


Danny asked about deploying Panda 4.0 and Payday 2.0 at the same time (Matt said Part A of Payday 2.0) and laughed #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Danny asked about something we've all grumbled over – why does Google stack updates on top of each other? Are they trying to confuse us? Matt made it seem that Google has tried not to overlap updates. In the specific case Danny was asking about, Matt said Payday Loan 2.0 (Part A) was initially supposed to go out early in the weekend, while Panda 4.0 was to go out later in the week. Matt implied a series of strange events (he was *very* vague about what those were, exactly) pushed the releases closer together than Google had originally planned. He said their goal was not to confuse webmasters.


Danny asks why webmasters do not get "you were hit by Panda" type messages #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Danny asked why we don't get "you were hit by Panda" notices. I took Danny's question to mean why we don't receive notifications of being algorithmically affected by an update the same way we do for manual actions in GWT. Matt replied that they try to let webmasters know about big algorithm updates (with a big impact) when they do them by making a public announcement – which didn't answer the question. Either I was wrong about what Danny was asking, or Matt misread what Danny was asking, or Matt knew exactly what Danny was asking and dodged it with that answer. ;-)


re GWT, matt says you should check out fetch and render as googlebot – he says they can fetch JS and Ajax now #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Matt made some announcements about new features in GWT, as well as features "coming soon" to GWT.

The first was that GWT recently added a "fetch and render as Googlebot" feature. He said they can fetch Ajax and JavaScript now. He also said that since Googlebot can now understand more code, we should stop blocking crawling of JS and CSS files. He said "more help" was coming to GWT regarding robots.txt files, and that more help was also coming for hreflang. Also on the "coming soon" list was more help regarding app indexing errors.
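To make the robots.txt point concrete: unblocking those assets usually just means removing (or overriding) the relevant Disallow rules. A hypothetical before/after sketch – the paths are invented for illustration, not something shown at the session:

```text
# Before: a common pattern that hides scripts and styles from crawlers,
# which now prevents Googlebot from rendering the page properly
User-agent: *
Disallow: /js/
Disallow: /css/

# After: let crawlers fetch the assets needed to render pages
User-agent: *
Allow: /js/
Allow: /css/
```

With the "fetch and render as Googlebot" feature, you can verify directly whether blocked JS or CSS is changing how Google sees the page.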

Matt stated that they'd made improvements to GWT around this process, and seemed to suggest that improvements for site moves would come in the form of both documentation and features – but that was my read and not explicitly stated. No exact (or even vague) timeline was given for the "coming soon" features.


danny asks a question about a penguin update – matt says he does not believe they have had a penguin update – they were focused on panda #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Danny asked if there had been a Penguin update since the last announced one (which was October 4, 2013, for those keeping track). Matt said he didn't believe so. Danny was incredulous for a second – asking Matt how he could "not know" if an update had occurred, LOL (thanks Danny, because we were all thinking that). Matt hinted they had been focused on Panda 4.0. He then said – no lie – that an engineer came up to him and said it was probably time for a Penguin refresh and that Matt agreed with him… and the subject changed.

Those of us on the Ask the SEOs panel the next day were asked about Penguin's next update – Greg Boser said he believed it was going to happen soon and that it would essentially be the biggest one to date. I added that I tended to agree – with all the information we've provided to Google about crappy sites over the last eight months in the form of disavows – it should be big.

WHY is there so much involved in getting out from under a Penguin penalty?

Danny questions the fact that people have to do the "walk of shame" – danny says it's a punishment for publishers, not for spammers #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

This was the best way I could title this part of the discussion, LOL. Danny asked why Google makes it so difficult to recover from a Penguin penalty. Why the need to remove links – the "walk of shame" – as opposed to simply disavowing things and being done with it? Matt said it was a "fair question," but he was very vague in his response.

He commented on how spammers could build tons of spam today and disavow it tomorrow – I think the implication was that spammers would be able to get a domain penalized and un-penalized too easily, hence the link removals and the time it takes to recover from Penguin – but I could be wrong.

Danny then offered an alternative solution: make it easier to bounce back from the first hit, but take a firmer stance on all subsequent hits for spammy link activity. Matt did not seem to like this idea – but gave no specific reasons why, LOL.


Matt said that they had brought back IE8 referrers #smx #smxYA – Danny says "you mean referrers that do not tell us anything?"

– Rae Hoffman (@sugarrae) June 12, 2014

Matt said IE8 referrers were back and Danny replied, "you mean the referrers that do not tell us anything?" And Matt laughed. He said they now show you IE8 referrers, although yes, these just show up as Google referrers. I think his reason for mentioning it was that any site with a significant IE8 user base would know why they might suddenly see a bump in Google referrals.


Danny says matt said a year ago that GWT could store a year of kw data, asks where it is #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Matt had said a while back that Google Webmaster Tools was working on showing/storing a year's worth of keyword data (right now they show 90 days of data). Danny asked when that promise would be fulfilled. Matt said he had seen someone in a previous session tweet about downloading the data every ninety days (it was a tweet about a comment I had made in the Keyword Research on 'Roids session). He said he knew it was not ideal, but gave a sort of "it is what it is" answer. No timeline was given on when – or if – the "year of data" would arrive.


Danny says that at this point it seems like it would be easier for matt to say which links are allowed #smx #smxYA danny asks is link building dead?

– Rae Hoffman (@sugarrae) June 12, 2014

Danny said that at this point, Google should start telling us what is allowed, because it seems like it would be a much smaller list to maintain than what is not allowed. Danny asked Matt if link building was dead. Matt said, "No, link building is not dead."

Then Danny clarified that he was not asking whether links were dead with regard to Google's algorithm – he was asking whether actively going out with the goal of building links "was dead." Matt then referenced a blog post Duane Forrester had written that said:

"You want the links to surprise you. You should never know in advance that a link is coming, or where it's coming from. If you do, that's the wrong path."

Matt agreed with the sentiment of Duane's remarks about link building in that post – however, he said that "you should never know in advance that a link is coming" went "a bit too far" as far as Google is concerned. He drove home that it's fine to create amazing content knowing it will help you get links – provided the content really is amazing and people link to it because it's amazing.

@mattcutts says "it's easier to be real than to pretend to be real" #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

And that about summarized it. Matt suggested that what Google takes issue with is not a website developing great content in the hope of attracting links – he implied the problem is "building links" through all the ways Google has explicitly called out in their guidelines, and through bare-minimum efforts where the content is not spectacular and the only purpose is to obtain a link (as opposed to a link and users and conversions and exposure).

Danny asks can G really evaluate the value of a page without links? matt says yes, it is possible #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Danny then asked if Google could really get an indication of the value of a page without links (in reference to the video featured in this article). Matt said yes, it was possible. Danny asked if Google could then turn off links cold turkey. Matt did his famous "uhhhh" and laughed. I assumed that meant the answer was no. ;-)


The following questions focused on things rumored to be factors in Google's algorithm.

Danny asks if G uses author rank for anything other than in-depth articles – Matt says "nice try" #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Danny said that it was not a difficult question to answer – is it used, yes or no? Matt stated that Author Rank was used for In-Depth Articles (something Google had already confirmed). Beyond that, Matt absolutely did not want to – and did not – give an answer. He did not say yes, but he did not say no either. Then Matt mentioned that he was a fan of Author Rank – and the subject changed.

Danny asks if G looks at on-site engagement re rankings – Matt says in general they are "open to looking at signals" BUT #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

The BUT was that while they are open to signals, Matt is "extremely skeptical" of using site engagement factors at face value and at scale in the algorithm, because they are very prone to manipulation.

danny asks if sites will get a boost for SSL, matt says currently no boost #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Matt said there is "currently" no boost for a site because it uses SSL. Matt again mentioned that he is a fan of SSL – he said anything that makes the web safer is better for all of us. Danny asked if this meant Google would default to the https version of a site over the http version if Google knew of and could access both versions. Matt said that at one point there was actually favoritism toward the http version, but he believed it had since been removed.

Matt says +1s not used in general rankings – is clear they can affect personalized rankings if peeps are in your circles #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Later in the session, Danny asked if Google+ was dead. Matt answered no, in a hurt voice, LOL. Matt said G+ data is not used in the general rankings. He added that if you search while logged into Google, you will probably see effects on your personalized rankings as a result of your Google+ activity (I agree).

danny tries to dig into the use of social signals re FB etc – matt says this is why engineers do not want to come to search shows #smx #smxYA ha

– Rae Hoffman (@sugarrae) June 12, 2014

Danny tried to get an admission – positive or negative – about whether Google looks at social signals from networks other than Google+. Matt jokingly replied that this is why search engineers do not want to come to search conferences. In other words, we got no answer to that question.

danny asks does site speed matter? Matt says sites that are extremely slow (he quotes "like 20 seconds") need to worry, not others #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Later in the session, Danny asked if site speed mattered in terms of your rankings. Matt said only extremely slow sites need to worry about site speed (he used the example of "like 20 seconds"). AKA, he seemed to imply that a site loading in 2.4 seconds has no advantage over a site loading in 4 seconds.


matt says manual penalties have a timer attached – a smaller offense, shorter time #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

The day before the You & A, I had sent Danny a question for Matt via Twitter. My question was:

"Matt has talked a lot about timed penalties, saying the length is often determined by the offense. Is there a case where a site can get a timed penalty that does not appear in the manual actions tab of GWT? And if all penalties, even those with 'timers,' appear in GWT, how do you know if yours carries a 'timer'?"

Danny asked my question using his own wording. Matt laughed, said he knew where the question came from, and called me out in the audience. I gave a wave.

Matt said that if you get a manual penalty, it will be visible in GWT. He seemed to imply that all manuals have some kind of timer attached to them – the smaller the offense, the shorter the time. He certainly did not answer the part about how we would know how long the "timer" was set for on any manual penalty we had incurred.

He did say that all manual penalties eventually expire (which is not new info) – but I would like to note here that Matt has said in the past that if you do not fix the reason for a manual and it expires, he is confident Google will find you and hit you with another manual fairly quickly.

So here is my takeaway – Google will never tell us how long a "timer" penalty will last based on the rules that were broken, because that would mean some of us would be willing to take the risk if we knew how long the "timer" was.

The other thing I took away from this is that it seems (thinking out loud here) there may be two aspects to removing a manual penalty. The first is fixing it and having Google remove the manual action in GWT – that is Google confirming you have corrected it and are no longer violating their guidelines. The second aspect is waiting for the "timer" to expire – and how long that takes is up for debate, and based on the "level" of your offense on a "bad to really bad" scale that we have no insight into.

There have been multiple reports of people fixing manual penalties but seeing no recovery despite Google removing the manual action notice after a successful reconsideration request in GWT. I imagine in those cases the timer has not yet expired, even though Google has acknowledged you corrected the root cause by removing the manual action message in GWT.

Matt was clear that manual penalties come with timers. He was also clear that all manual penalties eventually expire, whether or not the offense that caused the penalty was fixed. What Matt was NOT clear on was whether the "timer" keeps running (even if for many months, based on what you had done) even AFTER you get the manual action notification removed in GWT.

Additionally, Matt was clear that all manual penalties show in GWT. I will say, though, that I personally do not necessarily believe that is the case 100% of the time – but, according to Matt, that is how manual penalties roll.


danny asks if you disavow a domain once, and later decide that you do not want to disavow it, can you remove it? #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

So, you disavowed a link (or someone you hired in a frantic bid to save you from Penguin did) that was not actually a bad link. Is there a way, Danny asked jokingly, to "reavow" it? Matt said yes, you can basically "reavow" a link by removing it from your current disavow file and re-uploading the file without the link you want to "reavow."

However, Matt seemed to absolutely not like the "taste" of those words in his mouth. I could not tell if it was because he imagined people "reavowing" some of their crappy links after getting a manual penalty removed with that knowledge, or because a link is somehow "damaged" by a disavow and he did not want people accidentally shooting themselves in the foot. Or it could have been neither. But something about discussing "reavowing" links really seemed to make him uncomfortable.
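For context on the mechanics: the disavow file Google accepts is a plain-text list of URLs and `domain:` entries, and each new upload fully replaces the previous one for that site – so "reavowing" amounts to re-uploading the file with the entry deleted. A hypothetical example (the domains are invented for illustration):

```text
# disavow.txt – version originally uploaded
domain:spammy-directory.example
http://blog.example/some-page.html

# To "reavow" the blog.example link, re-upload the file without that line:
domain:spammy-directory.example
```

Because the upload is a full replacement, dropping a line is all it takes – there is no separate "undo" action in the tool.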


I have seen matt talk for years – he VERY sternly expressed how important it is for us to get our shit together #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Matt talked for several minutes about mobile. He kept saying how important it is for us to be ready for mobile. He asked the audience how many people had autofill markup on their mobile site forms. Almost no one raised a hand. Danny said "that's not mobile" and Matt said "yes it is." He said the mobile-dominant internet is "coming faster than most people in this room realize."
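The "autofill markup" Matt polled the room about is presumably the standard HTML `autocomplete` attribute, which lets mobile browsers fill a form from saved data instead of forcing thumb-typing – a minimal sketch, not code from the session:

```html
<form method="post" action="/signup">
  <!-- autocomplete tokens tell the browser which saved value to offer -->
  <input type="text"  name="full-name" autocomplete="name">
  <input type="email" name="email"     autocomplete="email">
  <input type="tel"   name="phone"     autocomplete="tel">
  <button type="submit">Sign up</button>
</form>
```

On a phone, each field then offers the user's stored name, email, and number with one tap – which is presumably why Matt counted it as a mobile readiness issue.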


matt clarifies that they know people worry re negative seo – he implied the payday loan update (part A) coming will help close some loopholes #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

This is where typing quickly while trying to live tweet will get you sometimes, haha. In the tweet I said Part A when I meant Part B.

Danny asked Matt what was going on with negative SEO. Matt said Google is very aware of negative SEO – but then sort of clarified that they are very aware of how many people worry about negative SEO. He implied that Payday Loan 2.0 Part *B* would close some of the loopholes people use for negative SEO.


Danny asked how search would be different for wearables.

matt answers w a demo re hummingbird LOL #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Matt replied, "so, about Hummingbird," and pulled out his phone, which has Google Now. He asked his phone "where is the Space Needle?" and his phone answered with the address. He then said, "I want to see pictures" and his phone showed him pictures of it. Matt asked, "who built it?" and his phone answered. Matt asked "how tall is it?" and the phone answered. Matt said, "show me restaurants near there" and his phone showed him a map listing of them. Matt said, "how about Italian?" and his phone showed him a list of Italian restaurants. Matt said, "navigate to the closest one" and his phone brought up the map with directions. The room applauded.

Matt said he thought that showed how different wearable search will be. He said Hummingbird is about connecting those dots. Matt also admitted it works better on mobile than on desktop, because people tend to use more natural language on mobile. He said he expects desktop to improve as they learn more from mobile usage.

Someone asks if JS links are treated like reg links, will they pass credit – matt says "mostly, yes" #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Matt said "mostly, yes" in response to whether JavaScript links are treated like regular links. Matt also noted that you can add the nofollow attribute to JavaScript links and Google will see it.
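As a hypothetical illustration (not something shown at the session), a link written into the page by JavaScript can carry `rel="nofollow"` just like a static link, and per Matt's comment Google's renderer will see it:

```html
<div id="links"></div>
<script>
  // Build the link client-side; the nofollow attribute travels with it
  var a = document.createElement('a');
  a.href = 'https://example.com/';
  a.rel = 'nofollow';
  a.textContent = 'Example';
  document.getElementById('links').appendChild(a);
</script>
```

Since Googlebot now executes JavaScript when rendering, the injected link is treated much like one in the static HTML – which is why the nofollow attribute still matters on it.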


danny asks a question about buzzfeed / quiz content #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Danny asked Matt what he thought of Buzzfeed and sites like them that essentially produce shallow content people eat up on social media. Matt said Buzzfeed had contacted them to ask why they were not ranking better. Matt said everyone thinks their own website is above-average quality, even when it is average or below average. It was obvious he thought Buzzfeed was overestimating their quality as far as how they should rank.


danny asks what actions G takes against blackhat tactics used on sites like youtube (spam videos, etc.) #smx #smxYA

– Rae Hoffman (@sugarrae) June 12, 2014

Danny asked what action Google was taking against known spam tactics – and then used YouTube spam as an example. Matt said they keep their ears open – he seemed to imply they know about spam tactics well before they implement something to take action on them algorithmically. He said targeting these tactics algorithmically can sometimes take some time.


danny asks about hiring link building service – matt says “creativity” will trump every tool avail in the industry #smx #smxYA

— Rae Hoffman (@sugarrae) June 12, 2014

Matt started to point out the differences between “link building” (consisting of PR4, with this anchor text, and “in content” type elements) and building links by “being excellent”.

. @mattcutts says white hat link building is called “being excellent” #smx #smxYA

— Rae Hoffman (@sugarrae) June 12, 2014

He didn't seem to have an issue with hiring someone to help you be excellent, come up with creative ideas, and help with the execution of those ideas. His issue was with hiring a "link building firm" in the 2009 sense, so to speak. AKA, hiring a promotions / publicity / true marketing company is ok; hiring a "link building" company is bad. My take based on his comments – to be clear.


danny asks if there's a whitelist of sites re penalties – matt starts with "well…" #smx #smxYA

— Rae Hoffman (@sugarrae) June 12, 2014

The last question Danny asked was whether or not Google had a "whitelist" of sites immune to penalties. Matt was really dodgy on this one, simply saying it "could happen". So Danny asked for an example – a specific site that had been whitelisted. Matt said he didn't know a specific one to give.

Matt then made it clear that a whitelist would only exist for something that was a known false positive. I wish I could remember the exact wording for y'all to dissect to death, but I don't. 🙂 It essentially amounted to: only in extreme circumstances where a site exhibited a false positive they couldn't fix algorithmically for whatever reason – again, my take. Matt then said, unequivocally, that there was no whitelist – at all – for Panda (I can't remember if he included Penguin here as well or not).


On 6/20/14 SMX released the video of Matt's full talk. Check it out below.

There ya go folks. Expanded coverage on the live tweets mixed with a few of my own opinions. Until next time…
