
Information About The Course Work: Tutorial 2, 3

The document discusses the hill climbing algorithm and how randomness can help agents escape local optima. It introduces the hill climbing program which simulates agents moving on landscapes with walls that block access to the global optimum. Randomness is introduced through a random number check that allows agents to occasionally change direction instead of always moving uphill. The reader is asked to run experiments with different randomness levels and world complexities to observe its effects on completion time.


TUTORIAL 2, 3
Information about the Course Work

Hill Climbing Program
Moving along a gradient of increasing intensity.

Could be a real hill (steepness: slope) but also any other gradient.
Examples from nature: moving in the direction of increasing scent (smell), light (sight), sound, temperature, or concentration of a chemical.
Computational examples: moving towards the lowest energy (physics), or the minimum of a function (mathematics).
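As an illustration, a minimal NetLogo sketch of pure gradient-following; it assumes a patch variable called intensity that holds the gradient (the course-work model organises this with its own procedures):

patches-own [ intensity ]   ;; assumption: a patch variable holding the gradient

to climb
  ask turtles [
    uphill intensity        ;; move to the neighbouring patch with the highest intensity
  ]
end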

Problem:
On the way, the procedure might get stuck at a local optimum (minimum)

In the Hill-Climbing program this problem is illustrated by a wall (river) that blocks access to the hill-top (where the gradient is at maximum intensity)

How to escape local minima / optima?


(How to get around the wall?)

MAKE AN UNEXPECTED JUMP
What is UNEXPECTED? By chance.

Generating random numbers can be used in optimization procedures* to simulate chance events that kick the system out of a local minimum.
In the Hill-Climbing program this is done by the command:
ifelse (random 100) < randomness [set heading random 160] [find-best-direction]
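As a sketch (not the model's actual code), this is roughly where such a command sits inside a go procedure; it assumes randomness is the 0-100 slider and find-best-direction is the model's own deterministic uphill procedure:

to go
  ask turtles [
    ifelse (random 100) < randomness
      [ set heading random 160 ]   ;; chance event: pick a new heading (the slide's command)
      [ find-best-direction ]      ;; otherwise: the usual uphill step
    fd 1                           ;; added here for illustration
  ]
  tick
end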

In the course work you are asked to experiment with the effect of randomness as well as with the structure of the environment

* A procedure for doing this is the so-called Boltzmann Machine. Another famous optimization algorithm based on random shocks is Simulated Annealing
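As a small illustration of such random shocks, here is a minimal sketch of Simulated Annealing's acceptance rule (the Metropolis criterion), written as generic NetLogo and not part of the course-work model:

to-report accept-move? [ delta t ]
  ;; delta = change in the quantity being minimised (negative = improvement)
  ;; t     = current "temperature"; as t drops, worse moves are accepted less often
  report (delta < 0) or (random-float 1 < exp ((0 - delta) / t))
end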

The amount of randomness can be manipulated by the slider randomness

Run experiments with different values of randomness (e.g. 0, 25, 50 and 75%)

The environment can be changed by drawing walls of increasing degree of complexity. Create a set of worlds characterised by different types of walls

WORLD 1 to WORLD 5
[Figures: five example worlds, one per slide, with walls of increasing complexity]

For each world, run the program for different degrees of randomness and repeat this five times. For each run, write down the time to completion (ticks)

1. Click Setup
2. Click Draw
3. Draw the walls
4. Click Go to run the program
5. After each run: measure the ticks until the turtles do not move any more
6. After each run: reposition the turtles

WHAT IS RANDOMNESS?
Randomness is related to chance events (coincidences): events that are not 100% predictable. They do not always take place at the same time and do not always have the same effect.

Chance events occur with a certain probability instead of certainty


Probabilistic events are the result of stochastic processes (as opposed to deterministic processes)

The probabilistic nature of stochastic processes is reflected in the variability of measurements (sometimes called error). In statistics it is assumed that the source of this variability is the summed effect of a very large number of unknown / uncontrollable factors. This leads to unpredictability.
Example: the outcome of rolling a die may be affected by temperature, height of rolling, force of rolling, friction, surface of substrate, ..., and hundreds of other things

Examples of stochastic processes


Tossing a coin: you can't be sure whether the coin comes up with a head or a tail (there is a 50% chance that either of these events will occur).
Rolling a die: six outcomes are possible, but which one will be realised is a matter of chance.
Random Variable: a function that connects a number to the outcome of a chance process.
Examples: when the coin shows a head, X = 1, else X = 0; the number of eyes (1, 2, 3, 4, 5 or 6) after rolling a die.

Due to a whole lot of unknown and uncontrollable factors, sometimes a die comes up with a 1, and other times with a 2, 3, 4, 5 or a 6. It is like blindly (a-selectively) picking one out of six numbers labelled 1, 2, 3, 4, 5 or 6. This is also called a random selection, or choosing a number between 1 and 6 at random. In computers, this stochastic process is simulated by a mathematical procedure called random number generation.
Example: in Excel the outcomes of rolling a die are simulated by the function =RANDBETWEEN(1,6). In NetLogo it is done by statements like 1 + random 6 (random 6 alone reports an integer from 0 to 5).
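For instance, a minimal sketch (a hypothetical procedure, not part of the course-work model) that prints 20 simulated die rolls to the Command Center:

to roll-dice
  ;; random 6 reports an integer 0-5, so 1 + random 6 simulates a die (1-6)
  let rolls []
  repeat 20 [ set rolls lput (1 + random 6) rolls ]
  show rolls
end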

Random Number Generation in Excel

Simulating rolling a die with the Excel function =RANDBETWEEN(1,6):

Run:    1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
Result: 1 6 4 1 6 6 1 6 1 6  2  1  1  4  6  3  4  5  5  3

(Frequency) Distribution of the outcomes / Probability distribution

X = #Eyes   Frequency, F(X)   Probability = F(X)/N
1           6                 30.00%
2           1                  5.00%
3           2                 10.00%
4           3                 15.00%
5           2                 10.00%
6           6                 30.00%
Σ           20                100.00%
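A minimal sketch of how such a frequency table could be tallied in NetLogo (NetLogo 6 syntax assumed; the procedure name is hypothetical):

to show-distribution [ rolls ]
  foreach [1 2 3 4 5 6] [ side ->
    let f length filter [ r -> r = side ] rolls        ;; frequency F(X)
    show (word side ": F = " f "  F/N = " (f / (length rolls)))
  ]
end

For example, show-distribution [1 6 4 1 6 6 1 6 1 6 2 1 1 4 6 3 4 5 5 3] reproduces the table above.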

Bar plot of the frequency distribution
[Figure: Histogram (20 runs); x-axis: #Eyes (1 to 6), y-axis: Frequency (0 to 8)]

Formal definition of probability

  Probability = (how often an event occurs) / (possible number of events)

Examples (theoretical probability):
The probability that a coin comes up with a head = 1/2 = 50%
The probability that a die after rolling comes up with a particular number is p(X = 1) = p(X = 2) = ... = p(X = 6) = 1/6

Estimated probability (from a sample):

  p(X) = f(X) / N = relative (normalized) frequency

where f(X) is the frequency of outcome X and N = sample size = number of events.

Theoretically, each side of a die has the same probability of coming up (1/6).
In practice you will rarely find that, after rolling a die a few times, all six sides show up with the same frequency. This is only approached after a very large number of trials!
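A minimal sketch of this convergence (a hypothetical reporter, not part of the model): the estimated probability of rolling a six approaches 1/6 ≈ 0.167 as the number of rolls grows.

to-report estimate-p-six [ n ]
  let hits 0
  repeat n [ if (1 + random 6) = 6 [ set hits hits + 1 ] ]
  report hits / n          ;; relative frequency f(X = 6) / N
end

show estimate-p-six 20 will often be far from 0.167, while show estimate-p-six 10000 is usually close.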
[Figure: Histogram (120 runs); x-axis: #Eyes (1 to 6), y-axis: Frequency (0 to 30)]

Probability Distributions

Uniform Distribution
Uniform distribution: a distribution in which the values have an equal probability of occurrence.
[Figure: histogram of a uniform sample; x-axis: Bin (-0.9 to 0.9), y-axis: Frequency (0 to 100)]

There are many other types of distributions (Normal, Exponential, Gamma)

Normal Distribution
Normal distribution: a distribution in which the values in the middle of the range (the norm) occur most frequently, while extremes are increasingly rare.
[Figure: histogram of a normal sample; x-axis: Bin (-3 to 3), y-axis: Frequency (0 to 250)]
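For reference, samples like the two histograms above can be drawn in NetLogo with the primitives random-float and random-normal; the parameters below simply mirror the example plots, and the reporter names are hypothetical:

to-report uniform-sample [ n ]
  report n-values n [ i -> (random-float 2) - 1 ]   ;; uniform on [-1, 1)
end

to-report normal-sample [ n ]
  report n-values n [ i -> random-normal 0 1 ]      ;; mean 0, standard deviation 1
end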

Randomness in Measurements (NOT as a tool to get out of local minima!)


Error is seen as undesirable by data analysts, as it precludes exact measurements and precise predictions. It implies that an average value does NOT suffice on its own as a summary of a set of observations! Some observations may be equal to the mean, but many are not. If you report a mean value to summarise measurements, it should be accompanied by a measure of the overall deviation from the mean.

This measure is known as the variance. Often its square root, the standard deviation, is used.

The Mean (= average) of a set of values v(i) is defined as:

  v̄ = ( Σ i=1..N v(i) ) / N

The Variance is the sum of the squared deviations from the mean, corrected for sample size N:

  s² = SS / df = ( Σ i=1..N (v(i) − v̄)² ) / (N − 1)

  SS = Sum of Squares, df = Degrees of Freedom (= N − 1)

The Standard Deviation is the square root of the variance:

  s = √s² = √[ ( Σ i=1..N (v(i) − v̄)² ) / (N − 1) ]

A related measurement is the Standard Error (SE):

  SE = s / √N
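A minimal sketch of these formulas as NetLogo reporters (the names are hypothetical; NetLogo also has built-in mean, variance and standard-deviation reporters for lists):

to-report sample-variance [ values ]
  let m mean values
  let ss sum (map [ v -> (v - m) ^ 2 ] values)   ;; SS: sum of squared deviations
  report ss / (length values - 1)                ;; divide by df = N - 1
end

to-report sample-sd [ values ]
  report sqrt (sample-variance values)
end

to-report standard-error [ values ]
  report (sample-sd values) / sqrt (length values)
end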

v(i), i = 1 ... 16 (N = 16):

i:  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16
A:  1  2  2  3  3  3  4  4  4  4  5  5  5  6  6  7
B:  1  2  3  3  3  4  4  4  4  4  4  5  5  5  6  7
C:  2  3  3  4  4  4  4  4  4  4  4  4  4  5  5  6

Different distributions may have the same mean! Dispersion: in some distributions the values are more strongly clustered around the mean than in others.

[Figure: frequency plots of distributions A, B and C; x-axis: Vj (0 to 8), y-axis: frequency (0 to 11)]

v̄ = 4

Parameters: Mean, Variance

The mean is the same for all 3 distributions (v̄ = 4), but the variances differ:

s² = 2.67 (A), 2.13 (B), 0.80 (C)

Small variance: data grouped around the mean, narrow distribution, reliable mean.
Large variance: broad distribution, unreliable mean.
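Applying the sample-variance reporter sketched above to the three columns reproduces the values on the slide:

show sample-variance [1 2 2 3 3 3 4 4 4 4 5 5 5 6 6 7]   ;; A -> 2.67
show sample-variance [1 2 3 3 3 4 4 4 4 4 4 5 5 5 6 7]   ;; B -> 2.13
show sample-variance [2 3 3 4 4 4 4 4 4 4 4 4 4 5 5 6]   ;; C -> 0.80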
