Information about Order or Not: Does Parallelization of Model Building in hBOA Affect Its Scalability?

Published on July 26, 2007

Author: pelikan

Source: slideshare.net

It has been shown that model building in the hierarchical Bayesian optimization algorithm (hBOA) can be efficiently parallelized by randomly generating an ancestral ordering of the nodes of the network prior to learning the network structure and allowing only dependencies consistent with the generated ordering. However, it has not been thoroughly shown that this approach to restricting probabilistic models does not affect scalability of hBOA on important classes of problems. This presentation demonstrates that although the use of a random ancestral ordering restricts the structure of considered models to allow efficient parallelization of model building, its effects on hBOA performance and scalability are negligible.


Motivation

Background: hBOA can solve nearly decomposable and hierarchical problems scalably, often in O(n^2) evaluations or faster. But O(n^2) is sometimes not enough, so we may need efficiency enhancement techniques: parallelization, evaluation relaxation, hybridization, and so on.

Parallelization of model building is among the most powerful efficiency enhancements, but it modifies the model building procedure. The speedups obtained have been analyzed; the effects of modifying the model building procedure on hBOA scalability have not been analyzed thoroughly.

Purpose: study the effects of model-building parallelization on hBOA performance.

Martin Pelikan, James D. Laury, Jr. Parallelization of Model Building in hBOA and Scalability

Outline: 1. Hierarchical BOA (hBOA) basics. 2. Parallelization of model building in hBOA. 3. Experiments. 4. Summary and conclusions.

Hierarchical Bayesian Optimization Algorithm (hBOA)

Hierarchical BOA (Pelikan & Goldberg, 2000, 2001) builds and samples a Bayesian network instead of using crossover/mutation, and uses local structures and niching.

(Figure: the hBOA loop: current population → selected population → Bayesian network → new population.)

Bayesian Network

A Bayesian network has two parts:

Structure. The structure determines the edges of the network (illustrated on the slide with a six-node example network over X0, ..., X5).

Parameters. The parameters specify the conditional probabilities of each variable given its parents (the variables that this variable depends on). Example: p(X2 | X0, X1).
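The two parts can be sketched in a few lines of Python (a minimal illustration, not the paper's implementation; the probability values and dictionary layout are made up for the example):

```python
# Structure: each node's list of parents. X2 depends on X0 and X1,
# matching the example p(X2 | X0, X1) above.
structure = {
    "X0": [],
    "X1": [],
    "X2": ["X0", "X1"],
}

# Parameters: a conditional probability table (CPT) per node, mapping each
# assignment of the parents to p(node = 1 | parents). Values are made up.
parameters = {
    "X0": {(): 0.6},
    "X1": {(): 0.5},
    "X2": {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.4, (1, 1): 0.9},
}

def prob_one(node, assignment):
    """p(node = 1 | values of its parents taken from `assignment`)."""
    key = tuple(assignment[p] for p in structure[node])
    return parameters[node][key]

print(prob_one("X2", {"X0": 1, "X1": 0}))  # -> 0.4
```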

Efficiency Enhancement of hBOA

Potential bottlenecks: 1. Model building. 2. Evaluation of candidate solutions.

Efficiency enhancements: 1. Parallelization. 2. Hybridization. 3. Time continuation. 4. Fitness evaluation relaxation. 5. Prior knowledge utilization. 6. Incremental and sporadic model building. 7. Learning from experience.

Parallel hBOA

What to parallelize? 1. Model building. 2. Fitness evaluation.

Parallelizing evaluation: the situation is the same as for other evolutionary algorithms; use a master-slave architecture to distribute evaluations.

This work focuses on parallelization of model building.

Model Building: Basics

Components of model building: 1. Learn the structure (dependencies). 2. Learn the parameters (conditional probabilities).

Learning the structure: start with an empty network; add one edge at a time (without creating cycles); choose the edges that improve the network the most.

Learning the parameters: compute frequencies from the selected solutions and estimate probabilities from the computed frequencies.
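The greedy structure-learning loop described above can be sketched as follows (an illustrative Python skeleton, not the authors' code; `score_gain` is a caller-supplied stand-in for the scoring metric, which in hBOA is a Bayesian-Dirichlet-style metric):

```python
from itertools import product

def creates_cycle(parents, src, dst):
    """True if adding edge src -> dst would close a directed cycle,
    i.e. if src is already reachable from dst along existing edges."""
    stack, seen = [dst], set()
    while stack:
        node = stack.pop()
        if node == src:
            return True
        if node in seen:
            continue
        seen.add(node)
        # children of `node` are the nodes that list it as a parent
        stack.extend(child for child, ps in parents.items() if node in ps)
    return False

def greedy_structure_learning(nodes, score_gain, k):
    """Start with an empty network and repeatedly add the single edge that
    improves the score the most, skipping edges that would exceed the
    parent limit k or create a cycle; stop when no edge helps."""
    parents = {n: set() for n in nodes}
    while True:
        best = None
        for src, dst in product(nodes, repeat=2):
            if src == dst or src in parents[dst]:
                continue
            if len(parents[dst]) >= k or creates_cycle(parents, src, dst):
                continue
            gain = score_gain(parents, src, dst)
            if gain > 0 and (best is None or gain > best[0]):
                best = (gain, src, dst)
        if best is None:
            return parents
        parents[best[2]].add(best[1])

# Toy run: a fake gain that rewards exactly the edges X0 -> X2 and X1 -> X2.
wanted = {("X0", "X2"), ("X1", "X2")}
learned = greedy_structure_learning(
    ["X0", "X1", "X2"],
    lambda parents, src, dst: 1.0 if (src, dst) in wanted else -1.0,
    k=2,
)
# learned == {"X0": set(), "X1": set(), "X2": {"X0", "X1"}}
```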

Complexity Analysis of Model Building

Notation: upper bound on the number of parents k; population size N; number of bits n.

Analysis: learning the structure takes O(k n^2 N); learning the parameters takes O(k n N).

Bottom line: learning the structure is the most expensive part, so the greedy algorithm for learning the structure should be parallelized.
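A quick back-of-the-envelope check of these bounds (illustrative Python; the values of k and N are arbitrary):

```python
# For fixed k and N, the structure-learning bound O(k * n^2 * N) exceeds
# the parameter-learning bound O(k * n * N) by exactly a factor of n, so
# structure learning dominates as problems grow.
k, N = 3, 10_000
ratios = []
for n in (50, 100, 200):
    structure_ops = k * n**2 * N
    parameter_ops = k * n * N
    ratios.append(structure_ops // parameter_ops)
# ratios == [50, 100, 200]: the gap grows linearly with problem size n
```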

Parallelizing Structure Learning

Master-slave approach (Ocenasek & Schwarz, 2000): 1. The master sends each node to a separate slave processor. 2. The master distributes the population to all slave processors. 3. Each slave processor determines the parents for its node. 4. The master collects the results and learns the parameters.

Variations: each processor deals with multiple nodes; parameters are also learned locally in the slave processors.

Ordering Nodes to Avoid Cycles

Motivation: the master-slave approach looks great, but a Bayesian network cannot contain cycles! Communication would be required after each new edge on any slave, which leads to a LOT of communication.

Ordering nodes (Ocenasek & Schwarz, 2000): order the nodes randomly before learning the structure and only allow edges consistent with the ordering. There is no need to check for cycles, each processor can deal with the edges into a single node, and communication is needed only before and after the learning.
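The ordering trick can be sketched as follows (an illustrative Python sketch, not the authors' implementation; the per-node scoring step is a stand-in, and a thread pool stands in for the slave processors of a real distributed run). Each node's candidate parents are exactly the nodes earlier in a random ancestral ordering, so the per-node searches are independent and cycle-free by construction:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def parents_for_node(node, candidates, k):
    # Stand-in for the real per-node greedy scoring step: an actual slave
    # would scan `candidates` with the scoring metric and keep the best
    # <= k parents. Here we simply take up to k candidates to show that
    # only nodes earlier in the ordering are ever considered.
    return node, candidates[:k]

def parallel_structure(nodes, k, seed=0):
    order = list(nodes)
    random.Random(seed).shuffle(order)  # random ancestral ordering
    # Node i may only take parents from order[:i], so no edge can close a
    # cycle and the per-node searches need no coordination at all; a real
    # run would ship each task to a slave processor (here: a thread pool).
    with ThreadPoolExecutor() as pool:
        results = pool.map(
            lambda i: parents_for_node(order[i], order[:i], k),
            range(len(order)),
        )
        return order, dict(results)

order, parents = parallel_structure([f"X{i}" for i in range(6)], k=2)
```

Because every parent precedes its child in the ordering, the merged network is guaranteed acyclic without any cross-worker checks.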

Ordering Nodes: Example

Consider the ordering (X1, X3, X2, X4, X0, X5). An edge is consistent with the ordering only if it points from an earlier node to a later one.

(Figure: a network consistent with the ordering and a network that violates it.)

Example Speedups with Parallel hBOA

Results from Ocenasek et al. (2004): nearly linear speedups even for relatively small problems; no need for parallel computers with shared memory.

Effects of Node Ordering

Positive effects: straightforward parallelization with little communication; slightly faster model learning even on a sequential computer.

Negative effects: model structures are restricted, so models may become less accurate, and the population size and number of generations might increase.

Description of Experiments

Test functions: concatenated traps of order 3 and 5; hierarchical trap; 2D Ising spin glass.

Scalability analysis: number of function evaluations with respect to problem size.

Parameters: population size set by bisection to ensure 30/30 successful runs; number of generations upper-bounded by n; window size in restricted tournament replacement (RTR): w = min{n, N/20}.
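The bisection method for setting the population size can be sketched as follows (a common EDA methodology sketch, not the authors' code; `run` is a hypothetical stand-in for a full hBOA run that returns True on success):

```python
def bisection(run, trials=30, start=16, tol=0.1):
    """Find a near-minimal population size N for which `run(N)` succeeds
    in all `trials` independent runs: double N until it is reliable, then
    bisect between the last unreliable and the first reliable sizes."""
    lo, hi = start, start
    while not all(run(hi) for _ in range(trials)):
        lo, hi = hi, hi * 2            # double until e.g. 30/30 runs succeed
    if lo == hi:                       # reliable on the very first try
        return hi
    while (hi - lo) / hi > tol:        # shrink the bracket to ~10%
        mid = (lo + hi) // 2
        if all(run(mid) for _ in range(trials)):
            hi = mid
        else:
            lo = mid
    return hi

# Toy check with a fake "run" that succeeds whenever N >= 100.
N_star = bisection(lambda N: N >= 100)
```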

Results on dec-3

(Figure: number of evaluations vs. problem size for hBOA and hBOA with ordered nodes.)

hBOA with node ordering needs slightly more evaluations, but the differences are relatively small.

Results on trap-5

(Figure: number of evaluations vs. problem size for hBOA and hBOA with ordered nodes.)

The results are mixed (neither approach is consistently better), but the differences are again relatively small.

Results on hTrap

(Figure: number of evaluations vs. problem size for hBOA and hBOA with ordered nodes.)

hBOA with node ordering in fact outperforms standard hBOA, but the differences are again relatively small.

Results on 2D Ising Spin Glass

(Figure: number of evaluations vs. problem size for hBOA and hBOA with ordered nodes.)

hBOA with node ordering needs slightly more evaluations, but the differences are relatively small.

Summary and Conclusions

Summary: tested the scalability of parallel model building in hBOA and analyzed the results.

Conclusions: parallelization leads to a slight increase in the number of function evaluations, but the negative effects are negligible compared to the gains. Bottom line: if model building is the bottleneck, parallelize the model building!

Acknowledgments

NSF; NSF CAREER grant ECS-0547013. University of Missouri: High Performance Computing Collaboratory sponsored by Information Technology Services; Research Award; Research Board.

Order or Not: Does Parallelization of Model Building in hBOA Affect Its Scalability? Martin Pelikan, Missouri Estimation of Distribution Algorithms Laboratory (MEDAL)

Order or Not: Does Parallelization of Model Building in hBOA Affect Its Scalability? Martin Pelikan and James D. Laury, Jr., GECCO, 2007.

Also published as MEDAL Report No. 200... by Martin Pelikan and James D. Laury, Jr.
