
HKB Project

27 Mar 2013 · CPOL · 11 min read
Website Content and Paid Advertising Optimization

So how many of you have built small website projects, launched them or made them public, only to wait many months for the website to start producing results such as a phone call or a contact-us message? What if you could launch a website, have it fully crawled and indexed by almost all search engine vendors in 7 days, and fully tuned in less than 30 days?

Such a task is possible if you have set up your project correctly and have written the tools needed to monitor the results on a daily basis, so you can make corrections.

I wish I could just paste some code snippets that would let you do this, but this challenge has been a work in progress for me over the last 10 years, and I would like to share my knowledge and techniques with you, the Code Project members. I have written many tools to aid in this, and have combined them all to create the HKB Project.

My project will help you confirm that your website is being crawled and that every page has been indexed. It will then monitor both organic and paid landings. For paid advertising landings, you can examine the keywords used to build positive and negative keyword lists. Negative keywords help narrow down your target demographic with laser-like accuracy.

If you are struggling to promote your website project, read this article in full to pick up some tips and tricks that will help you with the task.

Introducing the HKB Project    

So what is the HKB Project? HKB stands for Harmonic Keyword Balancing, an idea that occurred to me back in 2003 when I was studying website construction and how sites interact with search engines.

The HKB Project will assist you in getting your new website projects and marketing campaign content fine-tuned within the first couple of months of operation. Do not confuse the HKB Project with analytics software.

It is an ASP.NET 4.0 server control that you place on the master pages you want to track. It's packaged as a DLL file, so you simply drop the file in the Bin folder, create the control element, set your folder permissions, and off you go.

I've decided to share this program with the Code Project community to see whether the project is worth working on and whether there is any value in it. I'm also not able to complete the program on my own: I can't figure out how to write the spider I need to collect existing words, or how to display the collected words for comparison. In other words, my coding skills are not yet sufficient to finish the project.

If you're a qualified programmer who wishes to work on the project, I will be glad to share the source code with you, but you must have at least 10K Code Project points to qualify.

HKB Project Conception    

I'm sure you're rolling your eyes right now and think I'm crazy, but let me take the time to explain. This is a fun project I have been working on for over 10 years, but I was limited in time and programming experience to write the complex code needed for full operation, so it's still a work in progress. At this point, it's almost ready for commercial use; it just needs a built-in spider to crawl the website itself, a dictionary of words, and a word comparer to finish the project.

When I first started creating websites, I thought I was the master of words. I created well-written, grammatically correct marketing content that was basically word art. I was proud of the work and the content, but I was not able to generate organic page landings. As a couple of years went by, I kept asking myself why I was having trouble generating organic search results.

Then it hit me like a bolt of lightning one night while drinking with friends at a party. Listening to their conversation, I realized that everybody has their own word dictionary in their head, and each of my friends used a different phrase to describe the same thing. Different nomenclature was used to ask for things, specify an object, or describe an event, and few were the same. That was the answer: I wasn't using the same words that people use to look for things. So how does a person find the right words? I would need to start collecting the words and storing them for later use. At that point I realized that the true masters of words are the search engine users.

So that's how it started: I wrote a small program to collect the words used in search queries on a per-page basis, and used the data to correct the words on each page. Then I tried creating a broader set of words covering a wider range of people, and finally tried to achieve some sort of harmonic balance in the word sets on a per-page basis.

As a sanity test the following year, I had customers pick out keywords to use for paid advertising campaigns. At first they had to think of keywords themselves, until keyword lists were created for them to choose from. The results were interesting: people ranging from PhDs to high school graduates chose different sets of words that were important to them. Once again, that little word dictionary in everybody's head was different. So I came to the conclusion that no matter what part of the country you grew up in, or your level of education, IQ, or age, we are all unique and use different words to describe things.

HKB Dashboard - Main Interface

 

The project's main interface has a dashboard for a quick view of statistics. From here, you can look up Organic Landings, Crawler Statistics, and so forth.

More detailed information can be found here:

http://redcopper-online-marketing.com/EN-US/web-applications/HKB-Project/Default.aspx 

HKB Capture Module

The HKB Project has a front end on the client side called capture, and a back end report and management system.

Capture records data and stores it in XML format for later use or parsing. The back end reads the data and lets you investigate data points further, in effect tracing a landing back in time to its point of origin.

Capture records direct landings, organic search result landings, and paid advertising landings. It also captures crawler and bot visits, which are kept separate from browser visits.
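The actual control is written for ASP.NET, but the heart of the capture step, classifying a landing by its HTTP referrer and pulling out the search phrase, can be sketched in a few lines. This is an illustrative Python sketch, not the project's code; the `SEARCH_PARAMS` map is a small, hand-picked subset of engines.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical subset: search engine host -> query parameter carrying the phrase.
SEARCH_PARAMS = {
    "www.google.com": "q",
    "www.bing.com": "q",
    "search.yahoo.com": "p",
}

def keywords_from_referrer(referrer: str):
    """Return the search phrase behind a landing, or None for direct visits."""
    parts = urlparse(referrer)
    param = SEARCH_PARAMS.get(parts.netloc)
    if param is None:
        return None  # direct landing or non-search referrer
    values = parse_qs(parts.query).get(param)
    return values[0] if values else None
```

A referrer of `https://www.bing.com/search?q=solar+powered+radio` yields the phrase "solar powered radio", while an empty or unknown referrer is logged as a direct landing.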

HKB Crawler Report 

The crawler reporting module will help you see which crawlers are visiting and indexing your new website project. Use this module to confirm that all pages have been crawled. If you have a database-driven page that uses query string parameters or URL rewriting, confirm that each instance of that page has been registered in the crawler visit journal. For example, I have a part lookup page that represents 1,000 items, so I should see 1,000 visits to that single page, one for each instance of the query string parameter.

If you're not seeing 1,000 instances of that single page, then you have an error in the page construction or in the way you composed its HTML. Best practice is to write a sitemap generator in your database code that automatically produces all 1,000 versions of the page in a second sitemap.xml file to feed to the search engines. When you have honed your skills to the master level, you can have new projects fully crawled by the Google Bot in 7 days.
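The sitemap generator described above can be sketched as follows. This is an illustrative Python version under assumed names (the project itself targets ASP.NET); `base_url` and the `part` query parameter are hypothetical, and a real generator would pull the IDs from the database.

```python
from xml.sax.saxutils import escape

def build_sitemap(base_url, part_ids):
    """Emit one <url> entry per query-string variant of a database-driven page."""
    entries = "\n".join(
        f"  <url><loc>{escape(base_url)}?part={pid}</loc></url>" for pid in part_ids
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )
```

Feeding `build_sitemap("http://example.com/parts.aspx", range(1, 1001))` to the search engines advertises all 1,000 variants, so the crawler journal can be checked against a known total.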

Once you have solved or confirmed your crawler issues, you can begin analyzing organic page landings, and the keywords used to find them, using the Landing Report. The landing report gives you the data needed to adjust your page content. You will see keyword matches and mismatches, along with the page visitors landed on. Use this data to align the keywords on the page with the keywords people actually use to find it. One example: I assumed people would search for "tooth whitening", but "teeth whitening" is the phrase they actually use, so I changed the former to the latter.

HKB Keyword Report

Now you're ready to use the Keyword module. All the different versions of keywords and phrases used are listed here, sorted by page. A correctly constructed page will only have keywords and phrases that match that page exactly, and you should not have keyword overlaps across pages. For example, if you have 5 marketing bullet points, you should have 5 unique pages, one for each bullet point. Only the home page should contain all 5 keywords.
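One way to mechanize that overlap rule is to invert the page-to-keywords map and flag any keyword claimed by more than one non-home page. A Python sketch of the idea, not the project's code; the page paths are hypothetical:

```python
from collections import defaultdict

def keyword_overlaps(page_keywords):
    """Given {page_path: set_of_keywords}, return keywords claimed by more
    than one non-home page (candidates for consolidation onto one page)."""
    owners = defaultdict(set)
    for page, words in page_keywords.items():
        if page == "/":  # the home page is allowed to overlap everything
            continue
        for word in words:
            owners[word].add(page)
    return {word: pages for word, pages in owners.items() if len(pages) > 1}
```

Any keyword this returns is being diluted across several pages and, per the rule above, belongs on exactly one of them.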

Once you have sorted out your keywords, you're ready for paid advertising. Use the keywords in the Keyword module as a starting base for planning your Google AdWords text campaign. AdWords will make suggestions to get you started, but the suggestions are usually inaccurate or too broad. The Keyword module presents all the words, from which you can pick out the positive ones and kick out the negative ones to use as negative keywords in your campaign.

So if I take the keyword phrase "solar powered radio", I will get landings for all variants of that phrase, such as "windup powered radio". Here "powered radio" is the positive phrase, while "windup" is the negative word. The purpose of positive and negative keywords is to tighten your results into a smaller target, like a laser beam, and to spend your advertising cash only on the leads you really want.
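That positive/negative split can be automated: keep the landing phrases that contain every core term and none of the negatives, and treat everything else as a source of new negative keywords. A minimal Python sketch of this filtering, under the assumption that phrases are simple space-separated words:

```python
def split_landings(phrases, core_terms, negatives):
    """Keep phrases containing every core term and no negative term;
    the rejects suggest new entries for the negative keyword list."""
    keep, reject = [], []
    for phrase in phrases:
        words = set(phrase.lower().split())
        if core_terms <= words and not (negatives & words):
            keep.append(phrase)
        else:
            reject.append(phrase)
    return keep, reject
```

With core terms {"powered", "radio"} and negatives {"windup"}, "solar powered radio" is kept while "windup powered radio" is rejected, matching the example above.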

In most cases, when you launch a new website project, nobody knows it exists except you and the customer. So how do you spread the word? The best method is to launch an image advertising campaign: create display ads in various sizes and post them on other people's websites, rather like those billboards along the interstate that grab your attention. In this case we're looking for impressions over clicks at first, but once your image campaign is fine-tuned, you start getting a lot of clicks, which generate leads or actual sales. The Landing module will help you fine-tune your image campaign by giving you the information you need to make critical adjustments on a daily basis until it's perfect.

Using the code  

Download the Zip file to examine the sample project and its folder and file structure.

The program is compiled as a DLL file that you drop in the Bin folder, and then you register the capture control on your master pages or default home page.

The back-end admin pages live in a folder with a fixed location that needs write permissions, and the data is stored in the App_Data folder.

ASP.NET
<div style="width: 100%; height: 34px;">
    <Capture:redCopper.HKB_Capture_Control id="HKB_Capture_Control" 
        runat="server" HKB_ShowCaptureProgram="True"
        Page_Theme="Black" Powered_By="redCopper" Width="100%" />
</div> 

Points of Interest

The program uses XML files to store data and is very fast in operation. It is very good at distinguishing bots and crawlers from browsers, which helps keep the data as accurate as possible. There is also a Rampart module that blocks bad bots and crawlers, such as email harvesters; you can add entries to its list or delete them.
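The bot-versus-browser split is typically done by sniffing the User-Agent header against a list of known crawler markers. A minimal Python sketch, assuming a small hand-maintained marker list (a production module's list would be far longer and regularly updated):

```python
# Hypothetical subset of crawler markers found in User-Agent strings.
BOT_MARKERS = ("googlebot", "bingbot", "slurp", "baiduspider", "spider", "crawler")

def is_crawler(user_agent):
    """Naive User-Agent sniffing: enough to split report rows into
    'bot' and 'browser' buckets, but not a security control, since
    a malicious client can send any User-Agent it likes."""
    ua = user_agent.lower()
    return any(marker in ua for marker in BOT_MARKERS)
```

A blocklist module like Rampart would apply the same substring test against a list of known bad bots and deny the request instead of just tagging it.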

History 

  • March 25, 2013:  HKB Project V2013

This is the latest version of the project, and it is very stable. The dashboard has been added for a quick daily view, and you can then navigate to the detailed reports. The search engine vendor list has been updated to reflect that Google handles search results for AOL, and Bing handles Yahoo.

How to use the demo project - please read this

The demo project is an example of how to set up the HKB Project in your own project. If you unzip the demo and run it on your development machine in Visual Studio, it will do absolutely nothing; I will explain why below.

In the lower left corner of the default page is the HKB Project logo. Click it to load the back-end management interface, then log in using the credentials provided in the readme PDF file included with the demo.

The project ships with no data. I was going to provide live data, but it would expire after 7 days. To collect data, the project needs to be hosted somewhere crawlers and bots can reach it, so the capture program can begin capturing and storing. Once you have some data, you can use the back-end management pages to analyze it and get a true feel for how the project works.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)