
Optimization Algorithms

KMKLabs
March 14, 2018


Hybridization of optimization algorithms aims to produce better results than a single algorithm, and to do so more efficiently. For example, PSO is used to determine the optimum cluster centers for K-Means Clustering. Finally, the quality of the clustering is evaluated with the Silhouette Coefficient, which examines how well the clusters are separated and how compact each cluster is.


Transcript

  1. Optimization and Animal Behavior (?)

     Optimization algorithms inspired by collective animal behavior: • Cat Swarm Optimization • Ant Colony Optimization • Bee Colony Optimization • Particle Swarm Optimization (bird flocking)
  2. Optimization is..

     • Why are optimization approaches needed? Some people care not only about the quality of the solution but also about how much time it takes to reach it.. • What kinds of optimization problems can be solved by optimization algorithms? Problems that offer several alternative solutions..
  3. Brute force VS Optimization approach

     • In a brute-force attack, the system generates every possible solution. This takes a lot of time (depending on the length and character combinations of the password), but the method is guaranteed to reach the optimum solution, and that solution is specific: there are no other solutions. • Then what about the shortest route? There are several candidate solutions, and an optimization approach will find the optimum or, more often, a near-optimum solution (a small comparison is sketched below). • What if we optimize the brute force itself? That is possible, but it takes a lot of effort, and there is no guarantee of when the iterations can stop while still finding the exact optimum solution.
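To make the comparison concrete, the sketch below brute-forces a shortest open route by enumerating every permutation of the stops; the 4-stop distance matrix and all values are made up for illustration, not taken from the deck. Exhaustive enumeration is guaranteed to find the optimum but grows factorially, which is exactly where heuristics such as PSO become attractive.

from itertools import permutations

# Toy symmetric distance matrix for 4 stops (illustrative values only).
D = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 8],
     [10, 4, 8, 0]]

def route_length(route):
    # Length of an open route that visits the stops in the given order.
    return sum(D[a][b] for a, b in zip(route, route[1:]))

# Brute force: evaluate all (n-1)! routes that start from stop 0.
# Guaranteed optimum, but 20 stops would already mean 19! (about 1.2e17) routes.
best = min(permutations(range(1, 4)), key=lambda r: route_length((0,) + r))
print((0,) + best, route_length((0,) + best))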
  4. Particle Swarm Optimization

     • PSO is a population-based stochastic optimization technique developed by Dr. Eberhart and Dr. Kennedy, inspired by the social behavior of bird flocking • Particles (representations of solutions) "fly" through the solution space and are evaluated against a fitness criterion after each time step; in every iteration, each particle is updated by following two "best" values, its personal best and the global best (see the update equations below) • Parameters: inertia weight, swarm size (number of particles), number of iterations, acceleration coefficients 1 and 2, plus parameters specific to the problem being optimized • Components: swarm of particles, particle position, particle velocity, personal best, and global best.
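The slide does not spell out the update rules; for reference, the standard PSO formulation updates the velocity and position of particle i as follows, where w is the inertia weight, c_1 and c_2 are the acceleration coefficients, and r_1, r_2 are uniform random numbers in [0, 1]:

\[
v_i^{t+1} = w\,v_i^{t} + c_1 r_1 \,(pbest_i - x_i^{t}) + c_2 r_2 \,(gbest - x_i^{t}),
\qquad
x_i^{t+1} = x_i^{t} + v_i^{t+1}
\]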
  5. Scheme of Particle Swarm Optimization

     (diagram) • Best solution value: the global best tracked up to iteration n (the best solution found per iteration) • Known best value: the known optimum solution
  6. Pseudo code of Particle Swarm Optimization

     For each particle
         Initialize the particle
     End
     Do
         For each particle
             Calculate its fitness value
             If the fitness value is better than its best fitness value so far (pBest), set the current value as the new pBest
         End
         Choose the particle with the best fitness value from all particles and set it as gBest
         For each particle
             Calculate the particle velocity
             Update the particle position
         End
     While the termination condition is not attained
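To make the pseudo code concrete, here is a minimal, self-contained Python sketch of the same loop for minimizing an arbitrary fitness function; the function name pso, the default parameter values, and the sphere-function example are illustrative choices, not taken from the deck.

import numpy as np

def pso(fitness, dim, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5,
        bounds=(-5.0, 5.0), seed=0):
    """Minimal PSO sketch: minimizes `fitness` over `dim` dimensions."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))        # particle positions
    v = np.zeros((n_particles, dim))                         # particle velocities
    pbest = x.copy()                                         # personal bests
    pbest_val = np.array([fitness(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()               # global best

    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # Velocity update: inertia + cognitive (pbest) + social (gbest) terms.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                           # position update
        vals = np.array([fitness(p) for p in x])
        improved = vals < pbest_val                          # update personal bests
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()           # update global best
    return gbest, pbest_val.min()

# Example: minimize the sphere function; the optimum is at the origin.
best_x, best_f = pso(lambda p: float(np.sum(p ** 2)), dim=2)
print(best_x, best_f)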
  7. e.g.: the shortest route problem

     • What is the best route to take if I want to go to Karet Rail Station? (I am at Bekasi Planet.) Which route would give the shortest road trip?
  8. e.g.: the shortest route problem

     What if there are some rules that must be satisfied? • All routes/nodes must be traversed • Some nodes cannot be connected directly • Or some nodes are accessible only after passing through other specific nodes
  9. Application of Particle Swarm Optimization

     • Case: how to find the optimum cluster centers for K-Means Clustering? • Background: K-Means does not guarantee a unique clustering, because it generates different results for different randomly chosen initial cluster centers • Particle representation: each particle encodes the K candidate cluster centers as one vector, $x_i = (m_{i,1}, m_{i,2}, \ldots, m_{i,K})$ • Fitness: the total intra-cluster distance, $f(x_i) = \sum_{j=1}^{K} \sum_{z_p \in C_{i,j}} d(z_p, m_{i,j})$, i.e. the sum of distances from every data point to the center of the cluster it is assigned to (lower is better); a code sketch of this fitness follows below
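A minimal sketch of such a fitness function, under the assumption (consistent with the representation above, but not verbatim from the deck) that each particle is the flattened concatenation of the K candidate cluster centers; the name kmeans_fitness is hypothetical.

import numpy as np

def kmeans_fitness(particle, data, k):
    # Decode the particle into K cluster centers (one row per center).
    centers = particle.reshape(k, data.shape[1])
    # Distance from every data point to every center, shape (n_points, k).
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    # Assign each point to its nearest center and sum the distances (lower is better).
    return float(dists.min(axis=1).sum())

Plugged into the pso() sketch above, a call could look like pso(lambda p: kmeans_fitness(p, X, 3), dim=3 * X.shape[1], bounds=(float(X.min()), float(X.max()))).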
  10. Demo!

     • Here is a simple demo explaining how to optimize the cluster centers of K-Means Clustering using Particle Swarm Optimization..
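The demo code itself is not part of the transcript; as a stand-in, the sketch below ties together the hypothetical pso() and kmeans_fitness() sketches from above: it optimizes the centers on synthetic data, labels each point by its nearest optimized center, and scores the result with scikit-learn's Silhouette Coefficient.

import numpy as np
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Synthetic data with 3 well-separated blobs (illustrative only).
X, _ = make_blobs(n_samples=300, centers=3, random_state=42)
k = 3

# Optimize the k*d center coordinates using the pso() and kmeans_fitness() sketches above.
best, _ = pso(lambda p: kmeans_fitness(p, X, k), dim=k * X.shape[1],
              bounds=(float(X.min()), float(X.max())))
centers = best.reshape(k, X.shape[1])

# Assign each point to its nearest optimized center, then evaluate the clustering:
# a Silhouette Coefficient near 1 indicates compact, well-separated clusters.
labels = np.argmin(np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2), axis=1)
print("Silhouette Coefficient:", silhouette_score(X, labels))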