Forum URL: http://www.dombom.com/cgi-bin/dcforum/dcboard.cgi
Forum Name: The New MadBomber Marketing and SEO Forum
Topic ID: 484
Message ID: 8
#8, RE: 4 Making Pages Using FatBomb
Posted by Kurt on Oct-18-07 at 07:28 AM
In response to message #7
Hey Kelvin,

I've also thought about articles a lot and how to use them with Fatty.

Thing is, I haven't come up with a way of automating it, or even making it semi-automatic.

The strategy also depends on what type of rights you have for the content.

Even though I don't have the "perfect solution" yet, I'll post some strategies that may stimulate someone else...Some of them will only take a couple of hours of labor, in case someone doesn't mind the work.

Kelvin said:
>>Well, like before, I have hundreds of tips, paragraph size that could loaded up into fatbomb, to create hundreds of unique pages and searches, especially niche.

In this case, it's fairly easy...Put them all in a text file, then get each "chunk" on a single line.

In Textpad, select each chunk and ctrl+j (join lines).

Once you're finished, open Tuelz=>Linez Tuel and prefix each line with "||".

The "normal" FatBomb format is:
URL|Title|The description goes here...

We can leave the URL and TITLE blank, but both must be blank, not just one or the other. That leaves us with just content and no links:

||The description goes here...

Quick Note: You can use html in the description if you choose.
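If you'd rather script that step than do it in Textpad and Linez, here's a minimal Python sketch of the same idea (the function name and the sample tips are just mine, not part of Fatty):

```python
def to_fatbomb_lines(tips):
    """Prefix each non-empty tip with '||' -- a blank URL and Title,
    leaving just content and no links."""
    return ["||" + tip.strip() for tip in tips if tip.strip()]

# Example: two one-line tips become two database lines.
tips = ["Always test your headline first.", "Keep paragraphs short."]
for line in to_fatbomb_lines(tips):
    print(line)
```

Feed the output to a single text file and upload it as one database, same as you would with the Linez result.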

This will work very well for Kelvin's situation. Kelvin can jack up the weight of this database so that these tips are listed first, then use some of the scraper resources to fill out the pages.

Kelvin could even do some "spinning", substituting different keywords to come up with multiple versions of each "chunk".
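A rough sketch of that kind of spinning in Python. The {placeholder} convention and the sample keywords are my own invention here, not anything built into Fatty or Tuelz:

```python
import itertools

def spin(chunk, substitutions):
    """Create one version of the chunk for every combination of
    substitute keywords, swapping each into its {placeholder}."""
    keys = list(substitutions)
    versions = []
    for combo in itertools.product(*(substitutions[k] for k in keys)):
        text = chunk
        for key, word in zip(keys, combo):
            text = text.replace("{" + key + "}", word)
        versions.append(text)
    return versions

chunk = "Use {tool} to boost your {goal}."
subs = {"tool": ["FatBomb", "Tuelz"], "goal": ["traffic", "rankings"]}
for version in spin(chunk, subs):
    print("||" + version)
```

Two tools times two goals gives four versions of the one chunk, each ready to drop into a database line.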

Now, let's say you have some PLR articles or other content you have the rights to, such as articles you wrote yourself...You could split them into chunks as described above.

Or, you can post them on another site (or sites) of yours, use the real URL and TITLE, and still split them into chunks.

Put each chunk+title+URL into separate text databases and upload to Fatty. This is actually a very effective search relevancy technique: if an article is spread across multiple databases, the articles that have the keyword in the most databases will score a higher rank, thanks to the "frequency" bonus.

Since keywords were found in multiple chunks, it means the article had the keywords throughout the entire article, and not just in one paragraph.

Make sense?

The above strategy can also be used for articles that come from places like Ezine Articles, and it's a good, legit way to dedupe articles, as well as build links to the articles on your sites.

First, post the article in its entirety on one of your sites, including resource box.

Then, "chunk it" into 5 or more text database files, using a paragraph or so in each file.

EACH chunk will need to be in this format:
URL to Full Article|Title|Chunk

Now, different chunks will come up for different keywords, and each chunk will be mixed and matched with other stuff and scraped resources, creating a unique mixture of content and words.
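Here's a Python sketch of that chunking step. The URL and title are made-up examples, and it splits on blank lines, one paragraph per chunk -- adjust to however your articles are actually formatted:

```python
def chunk_article(url, title, article_text, chunk_size=1):
    """Split an article into paragraph chunks, each formatted as
    URL|Title|Chunk, ready for its own FatBomb text database file."""
    paras = [p.strip() for p in article_text.split("\n\n") if p.strip()]
    lines = []
    for i in range(0, len(paras), chunk_size):
        chunk = " ".join(paras[i:i + chunk_size])
        lines.append(f"{url}|{title}|{chunk}")
    return lines

article = "First paragraph.\n\nSecond paragraph.\n\nThird paragraph."
for line in chunk_article("http://example.com/art.html", "My Article", article):
    # Each line would go into a separate database file (db1.txt, db2.txt...)
    print(line)
```

Every chunk links back to the full article, so the mixed pages also build links to the article on your site.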


Let's say you have a bunch of PLR articles in text files, and each article also uses html to at least separate paragraphs.

Put each one in a text database file of its own and upload to Fatty.

In theory, you could have 500 text database files. Again, I haven't tested Fatty to check its capacity for tons of text databases.

But if this works, then also in theory, instead of using the Fatty admin to add each database one at a time, you could open the fatbomb/data/config.dat file in a text editor and enter all the databases that way.
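Purely as a sketch, and ASSUMING config.dat is just a plain list of database filenames, one per line -- verify that against a working config.dat before touching anything, since I haven't tested it:

```python
import glob
import os

def register_databases(config_path, data_dir):
    """Append every db*.txt file found in data_dir to the config file.
    ASSUMPTION: the config format is one database filename per line --
    copy the real layout from a working config.dat first."""
    names = sorted(os.path.basename(p)
                   for p in glob.glob(os.path.join(data_dir, "db*.txt")))
    with open(config_path, "a", encoding="utf-8") as cfg:
        for name in names:
            cfg.write(name + "\n")
    return names
```

Back up config.dat before running anything like this, just like you would before a Replacez run.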

Those that have mastered the Replacez Tuel could probably do this very quickly.

Fatty gives you tons of ways to mix and match content...It isn't limited to just scraping outside resources by any means.

I encourage everyone to spend some time on an ongoing basis and develop your own content/databases over time. This is how to create content that's truly one of a kind and of decent-to-good quality.