The FatBomb Forum is no longer active. Use it for reference only.
For active discussions of FatBomb, please visit:
The MadBomber Forum

Also see:
FatBomb ReadMe
  Newbie Install Guide

Subject: "Sucking pages - Converting dynamic to static pages"   Previous Topic | Next Topic
Printer-friendly copy    
Conferences FatBomb Topic #49
Reading page 1 of 1 pages
Kurtadmin
Member since Dec-5-02
Apr-07-06, 06:02 PM (PST)
"Sucking pages - Converting dynamic to static pages"
 
I'll get into this a little more, but for now here's a quick intro...

The first thing you need to do is download the free page "sucker":
www.httrack.com

It's free/open source. What it normally does is download a site's pages to your hard drive so you can browse them offline.

However, we'll use it to convert our dynamic FatBomb pages to static HTML.

Making pages with HTTrack:

I won't get into the specifics of HTTrack beyond this general guide. It assumes you've already created a "skin" for FatBomb that matches your site.

In the FatBomb admin, be sure to configure it for the resources you want to use.

Open HTTrack and:
-Name the project
-Create a category (not sure what this really does)

-Click NEXT
Here's where you do your "work".

Paste a long list of URLs into the "Web Addresses" box. We don't need to spider the site; instead, we just enter the exact URLs we want.

Note: Be careful. You don't want to suck pages so fast that you bring your own site to its knees or make your host mad.

To set your speed limits, click "Set Options" on the add-URLs screen, then click "Limits" and adjust the appropriate limits.

If you can't figure out the proper "Limits", just do 10-20 or so pages at a time... the last thing you want to do is bring down your own server. (There's also a scripted alternative sketched after these steps.)

- Click NEXT

- Click FINISH

Your FatBomb pages will be sucked and saved to your hard drive.

-If you own Tuelz, you can touch them up from there.
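
If you'd rather script the sucking than drive the HTTrack GUI, the same go-slow idea is easy to sketch in Python: fetch each URL from your list (built in the next section) with a pause between requests. This is just a minimal sketch, not part of FatBomb or HTTrack; urls.txt, the file naming, and the 2-second delay are my placeholders.

# slow_suck.py - a gentle, scripted alternative to the HTTrack step.
# Saves each URL in urls.txt as a local .html file, pausing between
# requests so your own server isn't brought to its knees.
import time
import urllib.request

DELAY = 2.0  # seconds between requests - placeholder, tune for your host

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for i, url in enumerate(urls):
    html = urllib.request.urlopen(url).read()
    with open("page%04d.html" % i, "wb") as out:  # crude file naming
        out.write(html)
    time.sleep(DELAY)  # be kind to your server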

How to make URLs to suck pages, using NoteTab:
Free at www.notetab.com

-Enter your list of keywords

-Find:
_ (the _ represents one blank space)

Replace:
- (hyphen)

Goal: to replace the blank space in every multi-word phrase with a hyphen.

-Add your base URL to the beginning of each phrase:

Find:
^p

Replace:
^phttp://YourDomain.com/cgi-bin/fatbomb/fatbomb.cgi/?skin=default&&keywords=

Or if using the htaccess file...
Replace:
^phttp://YourDomain.com/

Note the ^p (NoteTab's code for a line break) at the beginning of the URL.

This should put the base URL before all your keywords...
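
If you'd rather not use NoteTab, the same find/replace can be scripted. Here's a minimal Python sketch of the steps above (keywords.txt and urls.txt are my placeholder file names; both base URLs are copied verbatim from the examples above, double & included):

# make_urls.py - scripted version of the NoteTab steps above.
CGI_BASE = "http://YourDomain.com/cgi-bin/fatbomb/fatbomb.cgi/?skin=default&&keywords="
HTACCESS_BASE = "http://YourDomain.com/"

BASE = CGI_BASE  # or HTACCESS_BASE if you're using the htaccess file

with open("keywords.txt") as f:  # one keyword phrase per line
    phrases = [line.strip() for line in f if line.strip()]

with open("urls.txt", "w") as out:
    for phrase in phrases:
        slug = phrase.replace(" ", "-")  # blank space -> hyphen
        out.write(BASE + slug + "\n")  # base URL before each keyword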

Experts with Tuelz:

Run this list through Replacez to vary the length of your pages...
length=20
length=8
length=29
length=17

Say, from 8-30 results. A problem with scraper pages is that they all have the same number of results per page; varying the length gives your pages variety...
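
If you don't own Tuelz, here's a sketch of the same idea in Python: tack a random length onto each URL so no two pages have the same count. It assumes FatBomb reads a "length" query parameter, as the Replacez example above implies.

# vary_lengths.py - give each URL a random result count (8-30).
import random

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("urls_varied.txt", "w") as out:
    for url in urls:
        out.write(url + "&length=%d\n" % random.randint(8, 30))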

You can repeat these steps over and over again, using different "skins" and resources/databases each time to give you a vast assortment of pages for any given niche.

Also, run Namez to give your page file names some keywords. Don't worry about matching each one up perfectly. Note: I STRONGLY suggest using the htaccess method, which will give your pages file names like elvis-presley.html.

You can also use Tagz and Replacez to mess with your pages' titles and other stuff, so that your pages look more unique and less "machine made".
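
No Tagz? A few lines of Python can do a crude version of the same title trick on your sucked pages. The glob pattern and the suffix list are my placeholders, and this overwrites files in place, so back up your pages before running anything like it.

# retitle.py - crude stand-in for the Tagz/Replacez idea above: vary
# each sucked page's <title> so the pages look less machine-made.
import glob
import random
import re

SUFFIXES = [" - Guide", " - Resources", " Info", " Articles"]  # placeholders

for path in glob.glob("*.html"):
    with open(path, encoding="utf-8", errors="ignore") as f:
        html = f.read()
    html = re.sub(r"</title>", random.choice(SUFFIXES) + "</title>",
                  html, count=1, flags=re.IGNORECASE)
    with open(path, "w", encoding="utf-8") as f:
        f.write(html)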

The main purpose of sucking FatBomb pages is to create static HTML pages that are not only server friendly but can also be used on a variety of other domains without having to reinstall CGI scripts on each and every one.

While the initial sucking puts some load on the server, from that point on you'll have pages that load as fast as possible and require very little in the way of server resources.

I'll get into this a little more... Just download www.httrack.com for starters...


-Boom boom boom boom.


sedriskillteam
Member since Apr-28-05
Mar-29-07, 06:00 AM (PST)
1. "RE: Sucking pages - Converting dynamic to static pages"
 
Great idea. I want to point out that these niche portals make a great 'bonus' for membership sites and ebooks. A custom niche search engine is a nice bonus that few are offering, and it could really set you apart.

When I told my wife what I was doing to bring added value to 2 of my present niches, she looked awestruck and said "Really... how'd you do that?" All my wife cares about in IM is cashing checks, so this reaction made me take note of how impressed someone interested in these subjects and willing to spend money on information would be.

But I have a question: after I 'suck' the pages, they have file names that look like this:

www.theautismfiles.com/cgi-bin/autism/autismtalk.cgi/index5157.html

Now the only problem is I want to link these static pages on the skin using SSI includes or perhaps LinkBomb, but I can't, because I can't determine what keyword index5157.html refers to. Is there an htaccess trick I can use to 'resuck' the pages, or another way of doing this I am missing?


steve


Kurtadmin
Member since Dec-5-02
Mar-29-07, 06:49 AM (PST)
2. "RE: Sucking pages - Converting dynamic to static pages"
 

>Now the only problem is I want to link these static pages
>on the skin using SSI includes or perhaps LinkBomb, but I
>can't, because I can't determine what keyword index5157.html
>refers to. Is there an htaccess trick I can use to 'resuck'
>the pages, or another way of doing this I am missing?

Steve,

The only way I know of is to use mod_rewrite/.htaccess to create "static"-looking URLs before sucking. You would just need to make a list of the pages you want to "suck":

www.theautismfiles.com/kids.html
www.theautismfiles.com/children-health.html
www.theautismfiles.com/elvis-presley.html


Let me say, it's been a while and I "THINK" this will work.
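
For what it's worth, the rewrite rule would look something like this. It's only a sketch: the CGI path and the skin/keywords parameter names are taken from the first post, and the pattern itself is my guess, so test it on a copy of the site first.

# .htaccess sketch (untested) - map static-looking URLs to the FatBomb CGI
RewriteEngine On
# don't rewrite files that really exist on disk (e.g. pages already sucked)
RewriteCond %{REQUEST_FILENAME} !-f
# /elvis-presley.html -> fatbomb.cgi?skin=default&keywords=elvis-presley
RewriteRule ^([A-Za-z0-9-]+)\.html$ /cgi-bin/fatbomb/fatbomb.cgi?skin=default&keywords=$1 [L]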

Kirill can install this for you for a reasonable fee, but I'm not knowledgeable enough to give instructions.


-Boom boom boom boom.


sedriskillteam
Member since Apr-28-05
Mar-29-07, 08:40 AM (PST)
3. "RE: Sucking pages - Converting dynamic to static pages"
 
Sure, just have Kirill give me a shout...


Kurtadmin
Member since Dec-5-02
Mar-29-07, 08:41 AM (PST)
4. "RE: Sucking pages - Converting dynamic to static pages"
 
>Sure, just have Kirill give me a shout...

Steve,

Email me with the general details and I'll pass it on to Kirill. You'll need to pay Kirill directly.


-Boom boom boom boom.


Kurtadmin
Member since Dec-5-02
Mar-29-07, 08:47 AM (PST)
5. "RE: Sucking pages - Converting dynamic to static pages"
 
For those wanting to make and suck pages like this, use the Linez Tuel.

I don't want this thread to be about Linez, so no questions about it here; instead, post on the Linez thread in the Tuelz forum.

I will post this quick guide:

1. Paste your list of keywords into Linez.

2. Find: (enter a single blank space)
Replace: - (hyphen)

3. Suffix lines with: .html

4. Prefix lines with: http://www.yourdomain.com/
(assuming this is how the mod_rewrite is set up)

5. Paste the resulting URLs into HTTrack and run the program.

You can create tons and tons of static pages this way.
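
If you don't own Linez, those five steps are a few lines of Python. A minimal sketch (the file names and domain are placeholders, and it assumes the mod_rewrite setup above):

# linez_style.py - the five Linez steps above, scripted.
with open("keywords.txt") as f:  # 1. your keyword list
    phrases = [line.strip() for line in f if line.strip()]

with open("urls.txt", "w") as out:
    for phrase in phrases:
        slug = phrase.replace(" ", "-")  # 2. blank space -> hyphen
        # 3/4. suffix .html, prefix the domain (assumes matching mod_rewrite)
        out.write("http://www.yourdomain.com/" + slug + ".html\n")
# 5. paste urls.txt into HTTrack and run the program.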

For best results, use your own custom databases and/or the ability to give the best results more "weight" so they appear higher in your search results.


-Boom boom boom boom.



