SEO In One Day
Simply put, SEO is not as difficult as people make it out to be. You can get 95% of the results with 5% of the effort, and you don’t need to hire a professional SEO to get started. Nor will it be difficult to start ranking for carefully chosen key terms.
Of all the channels we’ll be talking about, SEO attracts the most false information. Some of it is subtly misleading, but some is openly disseminated and accepted by so-called SEO professionals who are, in reality, incompetent.
Since SEO is so straightforward, it’s generally not worth employing a professional to handle it unless your business is very large.
How Google Works
Let’s take a look at Google’s history and current state of development in order to gain a foundational understanding of what we need to do to rank well on Google and what we need to do for SEO.
The Beginning of Google
Einstein provided the inspiration for PageRank, the original ranking algorithm used by Google. While attending Stanford, Larry Page and Sergey Brin noticed how frequently renowned papers, such as the theory of relativity, were cited in other scientific publications. These citations functioned almost like votes: the more frequently your work was cited, the more important it must be. In theory, if they downloaded every scientific paper and read the references, they could rank the papers in order of importance.
They realized that links let the Internet be examined and ranked the same way, with links standing in for citations. They set out to “download” (or crawl) the whole Internet and find out which websites were linked to most frequently. The best websites were, in theory, those with the most links, and with that information they could evaluate and rank the pages that appeared when you searched for something like “university.”
Even though it is now considerably more sophisticated and nuanced, Google still functions substantially in the same way. For instance, not all links have the same weight. A link from an authoritative site is far more useful than one from a non-authoritative site (measured by the number of links pointing at it). A link from the New York Times is probably worth around 10,000 links from low-authority websites.
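The “links as weighted votes” idea can be sketched with a toy version of the PageRank iteration. The link graph and site names below are invented for illustration; real PageRank runs over billions of pages with many refinements.

```python
# Toy PageRank: each page's rank is split among the pages it links to.
# A link from a high-rank page (an "authoritative site") passes more
# weight than a link from a low-rank one.
links = {
    "nytimes.com": ["smallblog.com"],
    "smallblog.com": ["mysite.com"],
    "other.com": ["mysite.com", "smallblog.com"],
    "mysite.com": [],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly (simplification).
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:
                for target in outlinks:
                    new[target] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

ranks = pagerank(links)
```

In this sketch the most-linked-to page ends up with the highest rank, which is exactly the intuition Page and Brin started from.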
In the end, Google’s goal is to locate the “best” (or most popular) website based on the search terms you provide.
All of this simply means that we must first make it apparent to Google what our page is about, and then make it clear that we’re popular. Do that, and we win. We’ll use a really straightforward method to accomplish it, one that works every time and requires significantly less effort than you probably expect.
Gaming the System
Google is a very smart company. The algorithms they create are incredibly sophisticated; keep in mind that Google’s algorithms are currently being used to power autonomous vehicles that are driving themselves around Silicon Valley.
If you delve too deeply into the SEO rabbit hole, you may find yourself using spammy methods to try to hasten this process. Instructions to produce spam or spin material, linkwheels, PBNs, hacking domains, and automated tools like RankerX, GSA SER, and Scrapebox are just a few examples.
Some of that stuff works temporarily, but Google is intelligent and learning. Beating it gets more difficult every day, and spammy websites get taken down faster and faster; most don’t even last a week before everything you’ve built vanishes. Don’t go down that path.
Instead of churn and burn on the Internet, we’ll concentrate on establishing Internet equity. So, if you see a highly compensated SEO professional advising you to use software and spun content to gain links or if you see a blackhatter abusing the system, just know that it’s not worth it. We will quickly establish authority and increase traffic, but we will do it without jeopardizing the long-term viability of your website.
Making it clear to Google what our site is about is the first step in getting it ready to rank.
For the time being, we’ll concentrate on having our home page (our landing page) rank for a single keyword that isn’t our company’s or brand name. However, for the time being, we’ll remain laser-focused. Once we accomplish that and achieve that ranking, we can expand into other keywords and begin to rule the search landscape.
The first task is finding that keyword. How much traffic we get from this effort, and how much trouble it takes, will vary depending on how well-known our site is and how long it has been around.
The Long Tail
We need to understand a notion known as the “long tail.”
The “popularity” of most objects could be graphed, with “popularity” acting as the Y axis and rank order as the X axis, and the result would resemble a power law graph:
The majority of the attention is focused on a select few significant hits, and the graph quickly declines after a select few hits. According to the long-tail idea, the yellow end of the aforementioned graph will continue to grow taller and longer as our society becomes more varied.
Consider Amazon. Despite the fact that they presumably have a few best-selling items, the majority of their retail sales come from a variety of items that aren’t purchased nearly as frequently as their best-selling items. Similar to this, there would be a small number of hits that would receive the bulk of plays and a huge number of songs that would only receive a small number of plays if we were to rank the popularity of the songs played during the past ten years. We refer to those less well-liked goods and tunes as being in the long tail.
This matters in SEO because, at least initially, we’ll target long tail keywords—very precise, intention-driven search terms with less competition that we know may succeed—and then gradually move to the left.
In the beginning, our site won’t outrank extremely competitive terms, but by being more precise, we can start attracting very targeted traffic with a lot less effort.
We’ll call the terms we’re looking for “long-tail keywords.”
Finding the Long Tail
We’re going to combine four free tools to locate the ideal long-tail keywords for our business.
The procedure seems as follows:
- To generate some keywords, use UberSuggest, KeywordShitter, and a little bit of thinking.
- To gauge traffic volume, export those keywords to the Google Keyword Planner.
- Search for those keywords using the SEOQuake Chrome addon to determine how difficult they actually are.
Don’t be afraid; it’s actually very easy. For this example, we’ll pretend we’re looking for a keyword for this book (and we’ll likely need to build a site so you can check whether we’re ranking for it in a few months).
Step 1: Brainstorming and Keyword Generating
For this step, simply identifying a few keywords that seem to have potential is enough. Don’t worry too much about eliminating undesirable keywords at this stage; most of the bad ones will be weeded out automatically in the steps that follow.
Since this book is about growth hacking, the following keyphrases might be appropriate:
Growth hacking guide
Growth hacking book
Book about growth hacking
What is growth hacking
Growth hacking instructions
That’s a sufficient starting list. Check out keywordshitter.com if you start to run out of ideas; within a few minutes it will spit out thousands of permutations of a single term. Get a good list of 5–10 keywords to start with.
The next step is to enter each keyword into UberSuggest. I enter “growth hacking” as the first search term, and 246 results appear.
By selecting “view as text,” we can copy and paste them all into a text editor and build a huge list of keywords.
Use each keyword you came up with in the same way.
We’ll presume you now have at least 500 keywords; ideally you have more than 1,500. If you don’t, try starting with a broader, more general term, and you’ll get there shortly.
Step 2: Traffic Estimating
Now we’ve compiled a sizable list of keywords. The next thing we need to do is determine whether they get enough searches to be worthwhile.
You’ll probably see that some are way down the long tail and wouldn’t help us much. One item on my growth hacking list was “5 internet marketing techniques.” We won’t likely pursue that one, but we can use Google to research it for us rather than speculating. This is the weeding-out phase.
Google Keyword Planner
Although the Google Keyword Planner is a tool for advertisers, it does offer us a general indication of traffic volumes.
These numbers are probably only directionally accurate because Google doesn’t guarantee accuracy, but they’re sufficient to put us on the right track.
To use the tool, you’ll need an AdWords account, which you can open for free if you’ve never used one before.
After logging in, choose “Get search volume data and trends.”
Enter your lengthy list of keywords and select “Get search volume.” You’ll see a lot of graphs and data after you’ve done that.
We’re going to export our data to Excel using the “download” option and experiment with it there instead because the Keyword Planner UI is somewhat difficult to use.
We’re going to choose the traffic we want to pursue next.
This fluctuates a little depending on the level of authority your site has, so let’s try to ascertain how easy it will be for you to rank.
Visit SEMrush.com, type in your URL, and check out the number of backlinks in the third column:
According to the number of links you have, this is the highest level of “difficulty” you should pursue as a general rule (this may change depending on how old your site is, who the connections are from, etc.).
(Table: number of backlinks mapped to the maximum keyword difficulty to pursue.)
Sorting the data by difficulty will help you get rid of anything that is too difficult for your site (don’t worry, we’ll collect those keywords later). You may now just delete those rows.
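If you’d rather script the weeding-out than sort by hand in Excel, here is a rough sketch in Python. The column names “Keyword” and “Avg. monthly searches” are assumptions; match them to whatever your actual Keyword Planner export uses.

```python
import csv

def viable_keywords(rows, min_volume=100, max_volume=2000):
    """Keep keywords whose monthly search volume falls in a winnable range,
    highest volume first. Rows are dicts, e.g. from csv.DictReader."""
    keepers = []
    for row in rows:
        try:
            volume = int(str(row["Avg. monthly searches"]).replace(",", ""))
        except (KeyError, ValueError):
            continue  # skip rows with missing or non-numeric volume
        if min_volume <= volume <= max_volume:
            keepers.append((row["Keyword"], volume))
    return sorted(keepers, key=lambda kv: -kv[1])

# Usage with a Keyword Planner CSV export:
# with open("keywords.csv", newline="") as f:
#     shortlist = viable_keywords(csv.DictReader(f))
```

Adjust `min_volume` and `max_volume` to whatever range your site’s authority can realistically compete for.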
One crucial caveat: Google refers to this volume as “exact match” traffic. A slight variation of the keyword registers only when the terms are synonyms, not when the keyword merely appears inside a longer phrase, so the total traffic actually available is typically higher than the number you see.
After that disclaimer, order the traffic volume from highest to lowest, and then choose five terms that seem appropriate based on this information.
Here is my list:
growth hacking strategies
growth hacking techniques
growth hacking 101
growth hacking instagram
growth hacking twitter
Yours will probably look similar to mine, though not identical.
Unfortunately, Google’s indication of “keyword difficulty” is based on paid search competition rather than organic search competition, so we can’t rely on it here.
Let’s start by using Google Trends to concurrently view the term volume and trajectory. You can enter all the keywords at once to see how they compare on a graph. The format for my keywords is as follows:
I’m especially enthusiastic about the purple and red ones, which are “Growth Hacking Techniques” and “Growth Hacking Twitter.”
We’ll now look more closely at the level of competition for those two keywords.
Manual Keyword Difficulty Analysis
We’ll need to examine each keyword individually in order to determine how challenging it will be to rank for a particular keyword. In order to limit the list, we began by locating some long-tail keywords.
If you install the SEOQuake Chrome plugin, this procedure becomes much simpler. Once you’ve done that, search for something on Google to see some changes.
When SEOQuake is activated, each site’s pertinent SEO information is shown underneath each search result.
Click “parameters” in the left-hand sidebar and change them to the following values; the display will update accordingly:
Now, when you conduct a search, you will notice something similar.
SEOQuake also provides a ranking indicator and the following information at the bottom:
Search Engine Index: the number of pages Google has indexed from this root URL.
Page Links: the number of links pointing at the exact page that’s ranking, per SEMrush’s index (often quite low compared to reality, but since we’ll use this number only for comparisons, it’s relatively apples to apples).
URL Links: the number of links pointing at any page on the base URL.
Age: when the Internet Archive first indexed the page.
Traffic: a very rough estimate of monthly traffic for the base URL.
By examining these, we can attempt to estimate the effort required to surpass the sites in these places.
You’ll notice that these indicators carry different weights: not all links come from equally good sources, direct page links count for much more than URL links, and so on. But do some searching and experiment with it for a while, and you’ll get a very decent feel for what it takes.
If your website is brand new, it will take a month or two before you begin to build the amount of links necessary to rank first. It can just be an issue of getting your on-page SEO in order if you have an older site with more links. Typically, both will be present.
Remember that our page will be tailored to this particular keyword, giving us a slight advantage. To be clear, it’s a tough fight if you start to see pages from websites like Wikipedia.
Here are a few examples to help you understand how you should approach these issues, beginning with “Growth hacking techniques.”
A well-known website, Entrepreneur.com, specifically mentions “growth hacking techniques” in the headline. Although there are no links that lead directly to the page in the SEMRush index, it will be challenging to surpass this.
(By the way, I wonder how hard it would be to write an article for entrepreneur.com. I could probably do that and pick up a few links quickly, even linking to my own site in the piece.)
I had never heard of the website yongfook.com. It has 206 total links and not much traffic. Even though the title mentions “growth hacking techniques” outright and the page has quite a bit of age, I could probably overtake this one after a while.
Alright, so Quicksprout is a relatively well-known website with a large number of links, a good age, a lot of traffic, and a small number of links that point directly to the page.
But nowhere on the page is the word “techniques” mentioned. Since the page isn’t optimized for that particular term, I could probably win by optimizing mine for “growth hacking techniques.”
Let’s descend a bit to see how difficult it would be to make the first page.
17 pages indexed in total. Created in 2014? No links to the page or even the root URL? I can beat this one. I should have no trouble reaching the front page.
So it appears that this is an excellent keyword. All that’s left to do is set up the on-page SEO and begin creating some links.
After choosing a keyword, we must inform Google what our site is about. All that’s required is making sure the appropriate keywords appear in the appropriate places, most of which are the HTML tags that make up a webpage’s structure. If you don’t know HTML or how it works, simply hand this list to a developer and they should be able to help.
Here is a quick checklist you may use to determine whether your content is optimized.
On-Page SEO Checklist
☐ Your keyword appears in the <title> tag, ideally at (or near) the beginning.
☐ The visible portion of the <title> tag is under 65 characters (optional but preferred).
☐ Your page has an <h1> tag, and your keyword appears in the first <h1>.
☐ Your keyword or its synonyms appear in most of the other header tags (<h2>, <h3>, etc.) on the page.
☐ Every image on the page has an “alt” attribute containing your chosen keyword.
☐ The meta description contains your keyword (and there is a meta description).
☐ The page has at least 300 words of copy.
☐ Your keyword is in the URL (unless this is the homepage).
☐ The first paragraph of the copy includes your keyword.
☐ The page contains additional instances of your keyword (or synonyms; Google understands them now).
☐ Between 0.5% and 2.5% of your content consists of your keyword.
☐ The page contains dofollow links (which simply means you aren’t using nofollow on every link).
☐ The page’s content is unique, not copied from another page, and it differs from other pages on your website.
If you have everything in place, you ought to be good to go from an on-page standpoint. Unless you’re in an extremely competitive market, you’ll probably be the best-optimized page for your chosen keyword.
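If you want to sanity-check the keyword-density item from the checklist, here is a rough sketch. The convention used (occurrences of the full phrase per 100 words of copy) is one common definition, not an official Google metric:

```python
import re

def keyword_density(text, keyword):
    """Occurrences of the keyword phrase per 100 words of copy."""
    words = re.findall(r"[\w']+", text.lower())
    kw = keyword.lower().split()
    if not words or not kw:
        return 0.0
    # Count occurrences of the keyword phrase as a word sequence
    # (overlaps are counted too; fine for a rough check).
    hits = sum(
        1 for i in range(len(words) - len(kw) + 1)
        if words[i:i + len(kw)] == kw
    )
    return 100.0 * hits / len(words)
```

If the result lands outside roughly 0.5–2.5, consider adding or trimming mentions of the keyword.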
Off-page optimization is now all that’s left.
Off-page SEO is just another term for links. (Sometimes we call them backlinks; the concept is the same.)
Each link on the internet is treated by Google as a weighted vote. According to Google, when you link to something, you are essentially saying, “This is worth checking out.” Your vote bears more weight the more credible you are.
Strangely, this voting mechanism is referred to by SEOs as “link juice.” When a reputable website links to you, such as Wikipedia, they are “passing link juice” to you.
However, link juice doesn’t just travel from one site to another; if your homepage links to other pages on your website and is considered to be very authoritative, link juice is also passed. This makes the structure of our links extremely crucial.
Checking Link Juice
You can use a variety of tools to find out how many links point to a website and how authoritative those linking pages are. Unfortunately, none of them is perfect; the only way to find every link pointing to your site would be to crawl the entire Internet yourself.
Google crawls the most popular pages multiple times a day, but because it doesn’t want you to manipulate the rankings, it exposes its link data very slowly.
However, Google Search Console (formerly Webmaster Tools) lets you view at least a portion of Google’s index. Once you’re in your site’s dashboard, select “Search Traffic” on the left, then “Links to your site.” Whether this genuinely displays all of the links Google knows about is hotly contested, but at the very least it’s a representative sample.
Click “More” next to “Who links to you the most” and then “Download this table” to see all of your links. Once more, it appears that this only downloads a portion of what Google is aware of. The “Download latest links” option offers links that are more recent than the other choice.
Unfortunately, this makes it difficult for us to determine the value of the links or identify links that have lost authority or the sources of those links.
There are many tools available. Ahrefs.com has the largest index; Moz’s Open Site Explorer exposes much of its data with a free membership (and is otherwise somewhat less expensive than ahrefs); and SEMrush is free for most of the things we need, which makes it a good place to start on a limited budget. MajesticSEO’s combination of “trust flow” and “citation flow” also works reasonably well for gauging the overall health and quantity of links pointing at your site.
To establish the “authority” of a link, each of them uses a different internal metric, but it can be useful to compare them side by side.
An HTML link looks like the following:

<a href="http://www.somesite.com" title="keyword">Anchor text</a>

The link points to http://www.somesite.com, the title is mostly a relic of the past, and the linked text, the blue words you click on, is known as the “anchor text.”
Relevance of the anchor text is important in addition to a page’s link juice count.
In general, you should try to use your keyword as the anchor text for internal links whenever possible. External links (from other websites) shouldn’t have overly optimized anchor text; if 90% of your links share the same anchor text, Google may raise a flag and suspect that you’re engaging in unethical behavior.
If you ever create links, only use something general like the site name, “here,” or the whole URL (as we’ll show you in the future).
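To spot an over-optimized anchor-text profile in a list of backlink anchors, a quick sketch (the 30% threshold is my own rule of thumb for illustration, not a number Google publishes):

```python
from collections import Counter

def anchor_report(anchors, flag_above=30.0):
    """Percentage share of each anchor text, plus any that look over-optimized."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    shares = {text: 100.0 * n / total for text, n in counts.items()}
    flagged = [text for text, pct in shares.items() if pct > flag_above]
    return shares, flagged
```

Feed it the anchor-text column from an ahrefs or SEMrush export; anything flagged is a candidate for diversifying with generic anchors like the site name or bare URL.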
In general, you don’t want orphan pages (those that are not linked to other pages) or an excessively disorganized link structure.
According to some, the optimal link structure for a website looks like this:
That comes close, but it has a few errors. Ideally, every page would also link to every other page on the same level; in practice you’ll never have a structure that orderly, but it’s simple to approximate with a footer that resembles a sitemap or a list of “recommended” pages. Doing so lets you pass link juice freely from page to page and customize the anchor text of those links.
Unfortunately, drawing such a web would result in a mess, so you’ll just have to picture what it would truly look like.
Before we begin to receive the initial links pointing to our site, there is only one more thing we need to discuss.
Robots.txt, disavow, nofollow, and other minutia
Managing potential problems makes up the majority of SEO. There is a lot of that, but we’ll discuss what will satisfy 99% of demands instead, and if you need anything truly absurd, just Google it.
The page url.com/robots.txt is present on almost every website, including Google.
Using this simple text file, you can instruct search engine crawlers what to crawl and what not to crawl. Most crawlers are really decent listeners, except Bingbot, which essentially ignores your instructions and does what it wants. (Mostly joking.)
You may simply “disallow” a page in your robots.txt file by writing disallow: /somepage if you don’t want Google to crawl it (for example, a login page you don’t want indexed, a landing page, etc.).
It will also disallow all child pages if you add a trailing / (for example, disallow: /somepage/).
Technically, you can define distinct rules for various bots (or user agents), however it’s simplest to begin your file with “User-agent: *” if separate crawling rules are not required.
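Putting the pieces above together, a minimal robots.txt might look like this (the paths are placeholders):

```
User-agent: *
Disallow: /login
Disallow: /drafts/
```

The first rule blocks a single page; the trailing slash on the second blocks the directory and all of its child pages for every crawler.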
Because Google punishes sites with spammy link profiles, this opens the door to some bad-actor behavior. If your goal were to eliminate a rival, for instance, you could point a mass of blatantly spammy links at their website and let them pay for it. This is called “negative SEO,” it happens frequently around highly competitive keywords, and Google typically acts as though it never occurs.
In the event that this does occur, you can “Disavow” links in the Search Console, essentially telling Google not to count this one. I sincerely hope you never need it, but if you employ (or have hired) a subpar SEO or are under attack by a rival, that is how you defend yourself.
A link with the “nofollow” attribute looks something like this:

<a href="http://www.somesite.com" title="keyword" rel="nofollow">Anchor text</a>
You can use a nofollow link if you want to link to someone without the link counting as a vote (passing link juice), or if you host user-generated content and want to stop spammers from profiting off it. Google claims to discount the value of those links. I’m not sure how heavily they actually discount them, but other SEOs seem to believe it, and if nothing else it discourages spammers.
If you’re changing a URL but don’t want to lose its link juice, you can use a 301 redirect; a 301 passes most of the link juice along.
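On an Apache server, one common way to issue a 301 is a mod_alias rule in an .htaccess file; this sketch uses placeholder paths and a placeholder domain:

```
Redirect 301 /old-page https://www.example.com/new-page
```

Other servers have equivalents (for example, a `return 301` directive in nginx).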
When two pages are nearly identical, you can add a canonical tag, which says “treat this page as if it were that page instead, but don’t 301 it”:

<link rel="canonical" href="https://www.someurl.com/somepage">
We are now prepared to create our first links.
SEO starts to matter more when it comes to link building, and this is when many people get themselves into serious trouble.
The ideal strategy for link building is to avoid link building. In the past, I’ve worked for businesses where I didn’t have to beg for them because they came naturally through the press, customer blogs, their fantastic blog entries, etc. If this is a possibility, you’re in wonderful shape (and we’ll discuss a few strategies to make it more likely).
If not, we’ll at least start by manually creating a small number.
Rather than paying for cheap outsourced labor to churn them out, we’ll build them legitimately. The cheap route is a recipe for disaster; I’ve lost count of how many times I’ve seen it take down a website.
The simplest way to build high-quality links is what SEOs call “web 2.0s”: social sites and other websites where you can post things. Tweeting a link into the void accomplishes nothing, but profiles, status pages, and the like do carry some weight, and if they originate from a well-known domain, that counts as a link.
Some of the simplest are:
Twitter (a link in your user bio)
Github (a repository’s readme)
YouTube (a video’s description, though the video actually needs views)
WordPress (yes, you’ll actually need to start a blog)
Blogger (same as WordPress)
Upvote-based sites (HackerNews, GrowthHackers, Inbound.org, etc.)
You can start there and obtain six to twelve links. Longer lists of “web 2.0s” are easy to find online, but bear in mind that anything you build on a blogging platform needs to be developed thoroughly; it takes a lot of effort and content, and you have to do it well.
We typically maintain a longer list of Web 2.0s here. Even though some may be outdated, you should generally only create six to twelve Web 2.0s.
Purchasing an expiring domain is another strategy to gain link authority. Although it is more challenging to achieve this, there are several possibilities, like expireddomains.net. (Search “expired domains” and you’ll discover dozens of sites keeping an eye on them.)
You should buy an expired domain and use an archive to restore it as closely as possible to its previous state. These domains most likely have some link juice to pass on, and you can pass it to yourself.
Using a link intersection tool is another technique to locate areas where you can establish links. They identify websites that connect to “competitor a” and “competitor b” but not to you. Theoretically, they should be open to linking to you if they do so for both of your rivals. Tools for link intersection are available from Moz, Ahrefs, LunaMetrics, and other companies.
After establishing a few fundamental links, we’ll concentrate on certain tactics that will send ongoing links and press until we reach a point where we no longer need to build any more links.
Your First Drip of Traffic: Establishing Your Site as an Authority
Fantastic – your website converts effectively, your SEO is set up, and you are ready to attract visitors. What’s next?
As you’ve probably learned by this point, a site that converts really well but receives no traffic still converts zero visitors.
We’ll take care of that.
This phase requires a lot of time and work, and at first you might not feel like you’re accomplishing much. Remember that class in college that was so challenging that it served as a good way to screen out students who weren’t prepared to major in a particular field?
In terms of growth hacking, this is the weeder-out chapter.
Take a Long-Term View
Many individuals struggle with this step for the same reason they struggle with anything that requires a little consistent work over time: losing weight, contributing to a 401(k), and so on. You’ll begin with a small amount of traffic, look at the people with enormous oak trees, and think, “I must be doing something wrong.” You’re not doing anything wrong. The traffic starts as a trickle before it becomes a deluge.
If you’re a startup, however, don’t worry. In addition to creating Internet equity, our objective is to generate enough traffic to make the endeavor sustainable (i.e., we won’t pass away before we start to reap the benefits).
We aim to develop a steady stream of traffic that will continue to grow over time. We want to generate traffic now that will continue to provide us with a small trickle in five years. Using hundreds (or thousands) of tiny trickles, our conversion-optimized website, and a fantastic product, we will build a vast river.
Later chapters go into greater detail about the specific networks we can drive traffic from, so this chapter will concentrate on network-agnostic traffic: traffic that doesn’t depend on any one particular network.
I’ve seen this technique produce over 500,000 visitors per day, though it took nearly a full year to get there. This is just to give you an idea of scale. What might you accomplish with 500,000 daily visitors?
We’re going to start by introducing ourselves (and our organization) into the conversation wherever it’s happening using the keywords we discovered in the SEO chapter.
BuzzBundle software will be used to accomplish this.
With the help of this software, we are able to:
- Keep a constant watch for mentions of a particular subject, competitor, or term across every site that allows comments, including Facebook groups, Quora questions, and blog posts.
- Post a helpful comment that mentions our brand or business where it fits.
Disclaimer: this is not the SEO comment spam you’ve seen elsewhere.
This process requires thought, work, and a real person typing in real language. I don’t say this often, but you can’t successfully automate this step without it turning spammy. If you try to imitate the automated SEO spam you’ve seen on other blogs and websites, it won’t work: you’ll get blacklisted, you’ll get banned, and your clickthrough will be significantly lower than it could be.
We won’t launch some horrible piece of software to spam various internet comment areas with twisted mentions of junk in the hopes that this will increase our SEO traffic. Our comments need to accomplish two things:
- Maintain context. Only the subject of an article or tweet will be discussed, and references to our company will only be made when they make sense.
- Make a statement in the discussion. By reading your reply, I should gain knowledge or benefit in some other way.
If you do these two things, you’ll notice a few changes: because you’re thoughtful and genuinely trying to contribute, people will click on your links, and people will respect your business.
After that disclaimer, we’ll go into the specifics of how this is carried out.
Let’s start working with BuzzBundle now that everything has been cleared up.
Accounts and Personas
Go to Accounts -> Add new accounts in BuzzBundle first. Since we need accounts to comment, this is where we’ll start everything.
You’ll note that BuzzBundle lets you use numerous accounts. I believe it’s beneficial to have multiple accounts and, consequently, diverse points of view, but I don’t want to go too far and come across as spammy.
Making 2-3 personas that you identify with (or are) and adding them to your BuzzBundle accounts is something I’d advise.
Personally, I don’t even change my identity; I just go by a different name (e.g., Austen J. Allred vs. Austen Allred) or use a few different photographs to avoid having the exact same name and image used all over the Internet.
There are certain limitations to the commenting system Disqus, which is widely used. There are two ways to avoid being banned by Disqus if you consistently use the same link in posts:
- Use many different accounts, change your IP frequently, and occasionally use a proxy.
- Use the URL of your website as your “display name”
Although they both function, I think the second one is considerably simpler.
It will be quite helpful to use links with our UTM parameters here. As a result, we will be able to identify which blogs or websites are generating the most traffic so that, if necessary, we may focus even more on them.
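Building those UTM-tagged links by hand is error-prone, so here is a minimal sketch using Python’s standard library. The parameter values you pass in are whatever naming scheme you’ve chosen; nothing here is prescribed by the book.

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def add_utm(url, source, medium, campaign):
    """Append UTM parameters so comment-driven visits show up in analytics."""
    parts = urlsplit(url)
    query = parts.query + ("&" if parts.query else "")
    query += urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))
```

For example, `add_utm("https://example.com/book", "disqus", "comment", "launch")` yields a link whose visits you can attribute to that specific commenting effort.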
You could find it helpful to employ a few URL shorteners or some 301 redirects if you ever start to experience issues getting your link posted.
You can use a link shortener that 301s, like bit.ly, to make things easy, or if you want to invest a little more work, you can put up your own site and 301 the traffic from a certain page to your money site.
Next, we’ll set up our first topic in BuzzBundle.
You will first be prompted for a keyword. Even though we already have a keyword from the SEO part, we might want to go with something a little more general. I’m going to choose “growth hacking” for this.
Simply press “go” to launch BuzzBundle.
Different content kinds will be loaded into separate columns, but in general we will scroll through until we discover something that looks interesting and like we can actually add to it.
I clicked on this as my first selection:
It’s a review of another growth hacking book. I simply left a comment, tagged the author, asked him to review our book as well, and offered to send him a free copy. (If you’re that author and reading this right now, this is a little embarrassing.)
I assume this person considers the conversation completely genuine, because it is. The link I left on his video, which people searching for something else will find, is just a perk.
As an aside, if I’m going to leave a comment under my own name, I much prefer to hold “shift” and click the link so it opens in my default browser.
The next one I discovered was a compilation of excellent growth hacking blog pieces from the previous week.
I wrote the next comment:
Note that I followed him on Twitter to make it clear this was a personal message rather than a spam comment. I even slightly overreached and tweeted at him, just for fun.
That’s how you win people over.
As you go along and have an understanding of how to obtain a positive reaction, I’d advise beginning to sort by reach, increasing the quantity of keywords you’re looking for, and possibly—gasp—upgrading to BuzzBundle’s paid edition.