Search engine engineers and CEOs are probably asking themselves the same question.
The directory frenzy began when Google announced its PageRank mechanism; it started slowly but has accelerated dramatically lately.
If the trend continues, it is safe to say that within six months, with ever-improving automated submission tools, the quantity of links crawlers must consider will grow artificially and enormously.
Meanwhile the content itself will stay largely the same, and the services directories provide to visitors (other than the submitting webmasters) remain practically nil.
There is one axiom, one question, and one estimate:
The axiom:
Search engines will degrade and penalize some, but not all, directories.
The question:
Will this happen on an automated, algorithmic basis, or will human rating be involved?
This question is tied to the definitions of “bad neighborhood” and “link farm”. Will the algorithm decide that a directory has more than a certain percentage of links to “bad sites” and penalize it on that basis?
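The percentage-based test described above could be sketched roughly as follows. This is purely illustrative: the function name, the threshold value, and the list of "bad sites" are all invented here, not anything Google has published.

```python
# Hypothetical sketch of the percentage-threshold heuristic discussed
# above. The 25% threshold and the "bad sites" set are arbitrary
# assumptions for illustration, not a real search engine's algorithm.

def exceeds_bad_link_ratio(outbound_links, bad_sites, threshold=0.25):
    """Return True if the share of links pointing to known "bad sites"
    crosses the (arbitrary) penalty threshold."""
    if not outbound_links:
        return False
    bad_count = sum(1 for url in outbound_links if url in bad_sites)
    return bad_count / len(outbound_links) > threshold

# A directory with 3 of 10 links flagged would trip a 25% threshold:
links = [f"site{i}.example" for i in range(10)]
flagged = {"site0.example", "site1.example", "site2.example"}
print(exceeds_bad_link_ratio(links, flagged))  # True
```

Even this toy version shows the weakness the next paragraph points out: the test only counts links, so it says nothing about whether the remaining content is useful to a human visitor.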
Let’s take a look at DMOZ: based on this algorithm, Google would have to penalize DMOZ, because it is crowded with dead links and forbidden content (due to corrupt editors).
So we may assume that some kind of human rating will be added to the equation, because it is obvious that an “AI” cannot judge the value of content, especially its usefulness to real people.
In my opinion, all automated directories will be penalized. If submissions are not human-rated or approved, the directory will be blacklisted.
The same goes for all directories built on the same principle: those that approve links without providing any content or human intervention will be penalized.
The “paid” directories, which contain a handful of “selected” links, will not be penalized, but their usefulness as sources of real traffic is questionable, and the PR benefit will probably be degraded somewhat.
The “big” directories, countable on the fingers of one hand, will keep the same weight. Nothing new here 😀 the big guys are always friends.