A dilemma for fitness sharing with a scaling function


Research output: Book Chapters | Papers in Conference Proceedings › Conference paper (refereed) › Research › peer-review

27 Citations (Scopus)


Fitness sharing has been used widely in genetic algorithms for multi-objective function optimization and machine learning. It is often implemented with a scaling function, which adjusts an individual's raw fitness to improve the performance of the genetic algorithm. However, choosing a scaling function is an ad hoc affair that lacks sufficient theoretical foundation. Although this is widely known, an explanation of why scaling works has been lacking. This paper explains why a scaling function is often needed for fitness sharing. We investigate fitness sharing's performance at multi-objective optimization, demonstrate the need for a scaling function of some kind, and discuss what form of scaling function would be best. We provide both theoretical and empirical evidence that fitness sharing with a scaling function suffers from a dilemma that can easily be mistaken for deception. Our theoretical analyses and empirical studies explain why a larger-than-necessary population is needed for fitness sharing with a scaling function to work, and account for common fixes such as further processing with a hill-climbing algorithm. Our explanation predicts that annealing the scaling power during a run will improve results, and we verify that it does.
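The mechanism under study can be sketched as follows. This is a minimal illustration of the standard fitness-sharing formulation (shared fitness = scaled raw fitness divided by niche count, with a triangular sharing function), with a power-law scaling function f → f^β; the parameter names `sigma_share`, `alpha`, and `beta` follow common convention and are not taken from this paper:

```python
def sharing_function(d, sigma_share, alpha=1.0):
    """Triangular sharing function: full overlap at d = 0, none beyond sigma_share."""
    if d >= sigma_share:
        return 0.0
    return 1.0 - (d / sigma_share) ** alpha

def shared_fitness(raw_fitness, positions, sigma_share=0.1, alpha=1.0, beta=2.0):
    """Fitness sharing with a power scaling function.

    Each individual's raw fitness is first scaled as f**beta, then divided
    by its niche count: the sum of sharing-function values over all
    individuals within distance sigma_share (including itself, since
    sharing_function(0) = 1).
    """
    shared = []
    for i, fi in enumerate(raw_fitness):
        niche_count = sum(
            sharing_function(abs(positions[i] - positions[j]), sigma_share, alpha)
            for j in range(len(positions))
        )
        shared.append(fi ** beta / niche_count)
    return shared
```

The annealing the paper's prediction refers to would amount to raising `beta` gradually over the run (e.g. from 1 toward its final value as generations progress) rather than fixing it in advance.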
Original language: English
Title of host publication: Proceedings of 1995 IEEE International Conference on Evolutionary Computation
Number of pages: 6
Publication status: Published - 1995
Externally published: Yes


