{"id":9410,"date":"2019-10-25T16:14:59","date_gmt":"2019-10-25T19:14:59","guid":{"rendered":"http:\/\/blog.plataformatec.com.br\/?p=9410"},"modified":"2019-11-18T15:17:29","modified_gmt":"2019-11-18T17:17:29","slug":"monte-carlo-in-practice-finding-the-ideal-iteration-value","status":"publish","type":"post","link":"http:\/\/blog.plataformatec.com.br\/2019\/10\/monte-carlo-in-practice-finding-the-ideal-iteration-value\/","title":{"rendered":"Monte Carlo in Practice: Finding the ideal iteration value"},"content":{"rendered":"\n
One of the reasons to use any kind of project management methodology is to reduce costs.<\/p>\n\n\n\n
A single week of delay in a project creates two different types of cost:<\/p>\n\n\n\n
Since these costs can be high, it\u2019s necessary to use tools that provide more visibility into delivery dates. One tool that can help solve this problem is the Gantt Chart. With the Gantt Chart it\u2019s possible to identify the most fragile points of the project (the critical path), where more effort should be made to prevent delays, since a delay in any of those steps will impact the project as a whole.<\/p>\n\n\n\n
Even though this tool is great for some types of projects (normally when there is a low uncertainty level), it\u2019s not ideal for projects where the predictability is low due to the variance<\/a> of the deliverables, as in projects that happen in the field of knowledge (writing a book or software development, for example).<\/p>\n\n\n\n Here at Plataformatec, we have always done our best to make our delivery predictions based on data and ensure they follow proven scientific methodology. The two methods we use the most are:<\/p>\n\n\n\n In linear progression, we do data analysis on the delivered work items during some time period (in software development we normally work with weeks) and through the study of this information we come up with the best values for the progression. For example, suppose we have the following historical values of work items by week, as shown below:<\/p>\n\n\n\n A value that could be used as a pessimistic<\/strong> projection would be to assume a delivery of 1<\/strong> work item per week, seeing as we\u2019ve had 5 cases where the throughput was 1<\/strong> or 0<\/strong>. For the optimistic<\/strong> projection we could use either the value of 3<\/strong> or 4<\/strong>, or even 3.5<\/strong>, as we\u2019ve had 4 instances where the throughput was 3 or higher<\/strong>. Lastly, for the likely<\/strong> projection, a good value would be 2<\/strong>, as it\u2019s the median<\/strong> and the mode<\/strong> of this dataset.<\/p>\n\n\n\n The biggest problem with linear progression is that we are manually choosing the values for each scenario, something that can be dangerous if the person running the projection doesn\u2019t have a good grasp of data analysis. Furthermore, linear progression ignores variance. 
Thus, in work systems with a high variance of deliveries, the linear progression tool might not be the best approach.<\/p>\n\n\n\n Another technique we can use is called Monte Carlo<\/a><\/strong>, which makes random picks from our historic throughput data to estimate the likelihood of delivery.<\/p>\n\n\n\n I\u2019ve been having success with this method; however, a question kept coming to my mind: \u201cHow many iterations are necessary to make the method statistically valid and the results reliable?\u201d Looking for an answer to this question, I researched some statistics books, and also the internet, but wasn\u2019t able to find any information that would help.<\/p>\n\n\n\n So I decided to run a study on this subject. Monte Carlo\u2019s objective is to infer the probability of an event by \u201cforcing\u201d it to happen many times, in order to use the law of large numbers<\/a> in our favor. For example, if you flip a coin a thousand times, count how many times it lands on \u201cheads\u201d, and divide by a thousand, you will get an inference of the probability of getting \u201cheads\u201d.<\/p>\n\n\n\n To check the quality of the method, I first ran tests over probabilities for which I already knew the expected results. After that, I started increasing the complexity of the problem, in order to see if there was any correlation between the difficulty of a problem and the number of iterations necessary for the value inferred by Monte Carlo to be as close as possible to the calculated one.<\/p>\n\n\n\n The objective of the tests is to see how many iterations are necessary for the inferred value to be close enough to the calculated value that the gain from running more iterations would be too small to justify the computing power needed to run them (from a project forecasting standpoint, an error in the first or second decimal place would be acceptable). 
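To make the Monte Carlo idea concrete, here is a minimal sketch of the delivery-forecast use case described above, written in Python for illustration (the study itself uses R); the throughput history, backlog and horizon are hypothetical numbers, not data from the original post:

```python
import random

def delivery_probability(throughput_history, backlog, weeks, iterations=10_000):
    """Estimate the chance of delivering `backlog` work items within `weeks`
    by randomly resampling the historic weekly throughput."""
    successes = 0
    for _ in range(iterations):
        # One simulated future: pick a random historic week, `weeks` times
        delivered = sum(random.choice(throughput_history) for _ in range(weeks))
        if delivered >= backlog:
            successes += 1
    return successes / iterations

# Hypothetical history: work items delivered in each of the last 10 weeks
history = [1, 0, 2, 3, 1, 2, 4, 2, 1, 3]
print(delivery_probability(history, backlog=20, weeks=12))
```

Each iteration simulates one possible future by sampling past weeks with replacement; the fraction of futures that finish the backlog in time is the inferred probability of delivery.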
I\u2019d like to highlight that this blog post is not a scientific study of the subject; the objective is to find, through inference, a good value for project predictions.<\/p>\n\n\n\n The tests were performed using the programming language R<\/strong>, which can be downloaded at this link<\/a>. If you would rather use an IDE, I recommend RStudio<\/a>.<\/p>\n\n\n\n In each test, I\u2019ve run the algorithm 100<\/strong> times using the following iteration values: 100<\/strong> (one hundred), 1000<\/strong> (one thousand), 10000<\/strong> (ten thousand), 100000<\/strong> (one hundred thousand), 1000000<\/strong> (one million), 10000000<\/strong> (ten million), 100000000<\/strong> (one hundred million) and 200000000<\/strong> (two hundred million). The results of these 100 "rounds" are then consolidated.<\/p>\n\n\n\n The first test that I ran was the flipping of a single coin. In a coin flip, there is a 50%<\/strong> chance of the coin landing on either side, and this is the value we want Monte Carlo to infer. For that I\u2019ve used the following R code:<\/p>\n\n\n Changing the values of the iterations and compiling the results, I\u2019ve got the following table:<\/p>\n\n\n\n In the instance of a simple problem, like the flipping of a single coin, we can see that a good iteration value could be:<\/p>\n\n\n\n With the coin test it was possible to infer the number of necessary iterations, based on the desired degree of reliability. However, a question remains: does this behavior change for more complex problems? 
Because of that, I\u2019ve run a similar test with a single six-sided die, which has a 16.67%<\/strong> chance of landing on any side.<\/p>\n\n\n Changing the values of the iterations and compiling the results, I\u2019ve got the following table:<\/p>\n\n\n\n\n We can see that in this case the iteration values are the same as in the coin toss: a good value could be something between 10M<\/strong> and 100M if you need the error in the third decimal place, between 100k and 1M for an error in the second decimal place, and between 1k and 10k for an error in the first decimal place<\/strong>.<\/p>\n\n\n\n This result shows that either the complexity of the problem has no influence on the number of iterations, or both problems (coin toss and single die roll) have similar complexity levels.<\/p>\n\n\n\n Analyzing the execution times for both problems, it\u2019s possible to see that they took about the same time. This also corroborates the hypothesis that they have similar complexity levels.<\/p>\n\n\n\n To test the hypothesis that the complexities of the above problems were too close, and that this was the reason for the iteration values to be close, I\u2019ve decided to double the complexity of the die roll problem by rolling two<\/strong> dice instead of only one<\/strong> (it\u2019s important to note that doubling the amount of work doesn\u2019t always double the complexity order, as that depends on the algorithm\u2019s Big-O notation). The highest-probability value for two dice is 7, with a 16.67%<\/strong> chance of occurrence.<\/p>\n\n\n Changing the values of the iterations and compiling the results, I\u2019ve got the following table:<\/p>\n\n\n\n Again we find the same iteration quantity for each error level. 
In the case of this problem it\u2019s possible to see that it was more complex, since the execution for 200M took on average 7.47 seconds longer<\/strong> than the single die roll.<\/p>\n\n\n\n The inference I\u2019ve built up to this point is that the complexity of the problem has very little influence on the required iteration quantities, but to be sure of that, I ran one last test with 5 dice. The highest-probability sums when rolling five dice are 17<\/strong> and 18<\/strong>, with a 10.03%<\/strong> probability each.<\/p>\n\n\n\n Since this problem is a lot more complex than the others, I\u2019ve decided to run each iteration value only 10 times instead of 100.<\/p>\n\n\n Changing the values of the iterations and compiling the results, I\u2019ve got the following table:<\/p>\n\n\n\n Once again we can see that the values for the iterations are the same as above. It\u2019s possible to notice that this problem is clearly more complex, as it took 35 seconds longer<\/strong> than the roll of a single die.<\/p>\n\n\n\n After running those tests, it\u2019s possible to conclude that, for use in project management, a good iteration value would be somewhere between 1k<\/strong> and 100M<\/strong>, depending on the error level, as shown below:<\/p>\n\n\n\n The value that I\u2019ve been using for stakeholder reports is 3M. However, when I\u2019m running the method for my own analysis, I use a lower value, somewhere between 250k and 750k.<\/p>\n\n\n\n How about you? Have you tried to run Monte Carlo simulations? How many iterations are you using? Tell us in the comment section below or on Twitter @plataformatec.<\/p>\n","protected":false},"excerpt":{"rendered":" One of the reasons to use any kind of project management methodology is to reduce costs. A delay in a single week of a project creates two different cost types: The first is the cost of the team, since they will need to work another week. 
The second is the Cost of Delay, which is … \u00bb<\/a><\/p>\n","protected":false},"author":69,"featured_media":9413,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"ngg_post_thumbnail":0,"footnotes":""},"categories":[1],"tags":[123],"aioseo_notices":[],"jetpack_sharing_enabled":true,"jetpack_featured_media_url":"http:\/\/blog.plataformatec.com.br\/wp-content\/uploads\/2019\/10\/monte-carlo-in-practice-finding-the-ideal-iteration-value.gif","_links":{"self":[{"href":"http:\/\/blog.plataformatec.com.br\/wp-json\/wp\/v2\/posts\/9410"}],"collection":[{"href":"http:\/\/blog.plataformatec.com.br\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/blog.plataformatec.com.br\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/blog.plataformatec.com.br\/wp-json\/wp\/v2\/users\/69"}],"replies":[{"embeddable":true,"href":"http:\/\/blog.plataformatec.com.br\/wp-json\/wp\/v2\/comments?post=9410"}],"version-history":[{"count":24,"href":"http:\/\/blog.plataformatec.com.br\/wp-json\/wp\/v2\/posts\/9410\/revisions"}],"predecessor-version":[{"id":9551,"href":"http:\/\/blog.plataformatec.com.br\/wp-json\/wp\/v2\/posts\/9410\/revisions\/9551"}],"wp:featuredmedia":[{"embeddable":true,"href":"http:\/\/blog.plataformatec.com.br\/wp-json\/wp\/v2\/media\/9413"}],"wp:attachment":[{"href":"http:\/\/blog.plataformatec.com.br\/wp-json\/wp\/v2\/media?parent=9410"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/blog.plataformatec.com.br\/wp-json\/wp\/v2\/categories?post=9410"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/blog.plataformatec.com.br\/wp-json\/wp\/v2\/tags?post=9410"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}LINEAR PROGRESSION<\/h2>\n\n\n\n
MONTE CARLO<\/h2>\n\n\n\n
THE TESTS<\/h2>\n\n\n\n
THE COIN<\/h3>\n\n\n\n
\nFlipCoin <- function<\/span>(iterations)<\/span>\n<\/span>{\n # Creates a vector with the possible values for a coin: 1 = Heads and 0 = Tails<\/span>\n coin = c(0<\/span>,1<\/span>)\n result = 0<\/span>\n\n # Flips a number of coins equal to the iteration value and sums up all the times the coin landed on \"Heads\"<\/span>\n result = sum(sample(coin, iterations, replace=T))\n\n # Turns the count into a percentage of the coin flips<\/span>\n result = (result\/iterations) * 100<\/span>\n\n return<\/span>(result)\n}\n\n# Initializes the result and execution time vectors (empty, so no spurious first element)<\/span>\nresult_vector = c()<\/span>\ntime_vector = c()<\/span>\ncontrol = 0<\/span>\n\n# Controls the iteration quantity<\/span>\niterations = 100<\/span>\n\n# Stores the percentage of \"Heads\" results in a 100 size vector and the execution time in another<\/span>\nwhile<\/span>(control != 100<\/span>)\n{\n start_time = Sys.time()\n result_vector = append(result_vector, FlipCoin(iterations)) \n finish_time = Sys.time()\n\n time = finish_time - start_time\n time_vector = append(time_vector, time) \n\n control = control + 1<\/span>\n}\n\n# Shows the percentage of \"Heads\"<\/span>\nresult_vector\n\n# Shows the execution times<\/span>\ntime_vector\n\n<\/code><\/div>Code language:<\/span> R<\/span> (<\/span>r<\/span>)<\/span><\/small><\/pre>\n\n\n
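As a side note, the spread this experiment measures can be anticipated analytically: the standard error of a Monte Carlo percentage estimate is 100 * sqrt(p(1-p)/n), which is why each additional decimal place of precision costs roughly 100 times more iterations. A quick check (a Python sketch, not part of the original experiment):

```python
import math

def standard_error_pct(p, iterations):
    # One standard error of a Monte Carlo percentage estimate,
    # in percentage points: 100 * sqrt(p * (1 - p) / n)
    return 100 * math.sqrt(p * (1 - p) / iterations)

# Predicted spread for a fair coin (p = 0.5) at several iteration counts
for n in (100, 10_000, 1_000_000, 100_000_000):
    print(n, standard_error_pct(0.5, n))
```

For p = 0.5 this gives 5.0, 0.5, 0.05 and 0.005 percentage points at 100, 10k, 1M and 100M iterations, close to the standard deviations reported in the coin table.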
\n\n
\n\t \nIterations<\/th> Min result<\/th> Max result<\/th> Expected result<\/th> Average result<\/th> Average result - Expected result<\/th> Result's median<\/th> Result's median - Expected result<\/th> Result's Standard deviation<\/th>\n<\/tr>\n<\/thead>\n \n\t 100<\/td> 33.00000<\/td> 67.00000<\/td> 50.00000<\/td> 50.02000<\/td> 0.02000<\/td> 50.00000<\/td> 0.00000<\/td> 5.09368<\/td>\n<\/tr>\n \n\t 1000<\/td> 43.60000<\/td> 56.20000<\/td> 50.00000<\/td> 49.99440<\/td> 0.00560<\/td> 50.00000<\/td> 0.00000<\/td> 1.59105<\/td>\n<\/tr>\n \n\t 10000<\/td> 48.35000<\/td> 51.78000<\/td> 50.00000<\/td> 50.01263<\/td> 0.01263<\/td> 50.02000<\/td> 0.02000<\/td> 0.51001<\/td>\n<\/tr>\n \n\t 100000<\/td> 49.58000<\/td> 51.78000<\/td> 50.00000<\/td> 49.99110<\/td> 0.00890<\/td> 49.99350<\/td> 0.00650<\/td> 0.15805<\/td>\n<\/tr>\n \n\t 1000000<\/td> 49.85170<\/td> 50.15090<\/td> 50.00000<\/td> 49.99883<\/td> 0.00117<\/td> 49.99785<\/td> 0.00215<\/td> 0.04811<\/td>\n<\/tr>\n \n\t 10000000<\/td> 49.95433<\/td> 50.05807<\/td> 50.00000<\/td> 50.00013<\/td> 0.00013<\/td> 50.00010<\/td> 0.00010<\/td> 0.01564<\/td>\n<\/tr>\n \n\t 100000000<\/td> 49.98435<\/td> 50.01637<\/td> 50.00000<\/td> 50.00004<\/td> 0.00004<\/td> 49.99989<\/td> 0.00011<\/td> 0.00516<\/td>\n<\/tr>\n \n\t 200000000<\/td> 49.98890<\/td> 50.01195<\/td> 50.00000<\/td> 49.99981<\/td> 0.00019<\/td> 49.99987<\/td> 0.00013<\/td> 0.00345<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n <\/li><\/ul>\n\n\n\n ONE DIE<\/h3>\n\n\n\n
\nRollDie <- function<\/span>(iterations)<\/span>\n<\/span>{\n # Creates a vector with the possible values of the die<\/span>\n die<\/span> = 1<\/span>:6<\/span>\n result = 0<\/span>\n\n # Stores the die roll results<\/span>\n result = sample(die<\/span>,iterations,replace=T)\n\n # Counts every instance where the die landed on 6<\/span>\n result = sum(result == 6<\/span>)\n\n # Turns the count into a percentage of the die rolls<\/span>\n result = (result\/iterations) * 100<\/span>\n\n return<\/span>(result)\n}\n\n# Initializes the result and execution time vectors<\/span>\nresult_vector = c()<\/span>\ntime_vector = c()<\/span>\ncontrol = 0<\/span>\n\n# Controls the iteration quantity<\/span>\niterations = 100<\/span>\n\n# Stores the percentage of \"6\" results in a 100 size vector and the execution time in another<\/span>\nwhile<\/span>(control != 100<\/span>)\n{\n start_time = Sys.time()\n result_vector = append(result_vector, RollDie(iterations)) \n finish_time = Sys.time()\n\n time = finish_time - start_time\n time_vector = append(time_vector, time) \n\n control = control + 1<\/span>\n}\n\n# Shows the percentage of \"6\"<\/span>\nresult_vector\n\n# Shows the execution times<\/span>\ntime_vector\n\n<\/code><\/div>Code language:<\/span> R<\/span> (<\/span>r<\/span>)<\/span><\/small><\/pre>\n\n\n
\n\n
\n\t \nIterations<\/th> Min result<\/th> Max result<\/th> Expected result<\/th> Average result<\/th> Average result - Expected result<\/th> Result's median<\/th> Result's median - Expected result<\/th> Result's Standard deviation<\/th>\n<\/tr>\n<\/thead>\n \n\t 100<\/td> 33.00000<\/td> 67.00000<\/td> 50.00000<\/td> 50.02000<\/td> 0.02000<\/td> 50.00000<\/td> 0.00000<\/td> 5.09368<\/td>\n<\/tr>\n \n\t 1000<\/td> 43.60000<\/td> 56.20000<\/td> 50.00000<\/td> 49.99440<\/td> 0.00560<\/td> 50.00000<\/td> 0.00000<\/td> 1.59105<\/td>\n<\/tr>\n \n\t 10000<\/td> 48.35000<\/td> 51.78000<\/td> 50.00000<\/td> 50.01263<\/td> 0.01263<\/td> 50.02000<\/td> 0.02000<\/td> 0.51001<\/td>\n<\/tr>\n \n\t 100000<\/td> 49.58000<\/td> 51.78000<\/td> 50.00000<\/td> 49.99110<\/td> 0.00890<\/td> 49.99350<\/td> 0.00650<\/td> 0.15805<\/td>\n<\/tr>\n \n\t 1000000<\/td> 49.85170<\/td> 50.15090<\/td> 50.00000<\/td> 49.99883<\/td> 0.00117<\/td> 49.99785<\/td> 0.00215<\/td> 0.04811<\/td>\n<\/tr>\n \n\t 10000000<\/td> 49.95433<\/td> 50.05807<\/td> 50.00000<\/td> 50.00013<\/td> 0.00013<\/td> 50.00010<\/td> 0.00010<\/td> 0.01564<\/td>\n<\/tr>\n \n\t 100000000<\/td> 49.98435<\/td> 50.01637<\/td> 50.00000<\/td> 50.00004<\/td> 0.00004<\/td> 49.99989<\/td> 0.00011<\/td> 0.00516<\/td>\n<\/tr>\n \n\t 200000000<\/td> 49.98890<\/td> 50.01195<\/td> 50.00000<\/td> 49.99981<\/td> 0.00019<\/td> 49.99987<\/td> 0.00013<\/td> 0.00345<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n\n\n\n\n TWO DICE<\/h3>\n\n\n\n
RollsTwoDice <- function<\/span>(iterations)<\/span>\n<\/span>{\n # Creates a vector with the possible values of the die<\/span>\n die<\/span> = 1<\/span>:6<\/span>\n result = 0<\/span>\n\t\n # Stores the die roll results<\/span>\n first_roll = sample(die<\/span>,iterations,replace=T)\n second_roll = sample(die<\/span>,iterations,replace=T)\n\n # Sums the vector values position-wise<\/span>\n result = first_roll + second_roll\n\n # Counts every instance where the dice summed to 7<\/span>\n result = sum(result == 7<\/span>)\n\n # Turns the count into a percentage of the dice rolls<\/span>\n result = (result\/iterations) * 100<\/span>\n\n return<\/span>(result)\n}\n\n# Initializes the result and execution time vectors<\/span>\nresult_vector = c()<\/span>\ntime_vector = c()<\/span>\ncontrol = 0<\/span>\n\n# Controls the iteration quantity<\/span>\niterations = 100<\/span>\n\n# Stores the percentage of \"7\" results in a 100 size vector and the execution time in another<\/span>\nwhile<\/span>(control != 100<\/span>)\n{\n start_time = Sys.time()\n result_vector = append(result_vector, RollsTwoDice(iterations)) \n finish_time = Sys.time()\n\n time = finish_time - start_time\n time_vector = append(time_vector, time) \n\n control = control + 1<\/span>\n}\n\n# Shows the percentage of \"7\"<\/span>\nresult_vector\n\n# Shows the execution times<\/span>\ntime_vector\n<\/code><\/div>Code language:<\/span> R<\/span> (<\/span>r<\/span>)<\/span><\/small><\/pre>\n\n\n
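Since two dice have only 36 equally likely outcomes, the 16.67% target can also be verified exactly by enumeration (a Python sketch, not part of the original experiment):

```python
from itertools import product

# All 36 equally likely outcomes of rolling two six-sided dice
outcomes = list(product(range(1, 7), repeat=2))
hits = sum(1 for a, b in outcomes if a + b == 7)
print(hits, len(outcomes), round(100 * hits / len(outcomes), 2))  # 6 36 16.67
```

Six of the thirty-six outcomes sum to 7, so the exact probability is 6/36, which matches the value Monte Carlo converges to.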
\n\n
\n\t \nIterations<\/th> Min result<\/th> Max result<\/th> Expected result<\/th> Average result<\/th> Average result - Expected result<\/th> Result's median<\/th> Result's median - Expected result<\/th> Result's Standard deviation<\/th>\n<\/tr>\n<\/thead>\n \n\t 100<\/td> 6.00000<\/td> 28.00000<\/td> 16.66667<\/td> 16.58100<\/td> -0.08567<\/td> 16.00000<\/td> -0.66667<\/td> 3.64372<\/td>\n<\/tr>\n \n\t 1000<\/td> 13.20000<\/td> 20.40000<\/td> 16.66667<\/td> 16.68500<\/td> 0.01833<\/td> 16.60000<\/td> -0.06667<\/td> 1.16949<\/td>\n<\/tr>\n \n\t 10000<\/td> 15.60000<\/td> 17.94000<\/td> 16.66667<\/td> 16.65962<\/td> -0.00705<\/td> 16.67000<\/td> 0.00333<\/td> 0.38072<\/td>\n<\/tr>\n \n\t 100000<\/td> 16.22200<\/td> 17.06200<\/td> 16.66667<\/td> 16.66109<\/td> -0.00557<\/td> 16.66500<\/td> -0.00167<\/td> 0.11356<\/td>\n<\/tr>\n \n\t 1000000<\/td> 16.54960<\/td> 16.54960<\/td> 16.66667<\/td> 16.66616<\/td> -0.00050<\/td> 16.66790<\/td> 0.00123<\/td> 0.03650<\/td>\n<\/tr>\n \n\t 10000000<\/td> 16.63294<\/td> 16.70266<\/td> 16.66667<\/td> 16.66607<\/td> -0.00060<\/td> 16.66577<\/td> -0.00090<\/td> 0.01220<\/td>\n<\/tr>\n \n\t 100000000<\/td> 16.65592<\/td> 16.67948<\/td> 16.66667<\/td> 16.66670<\/td> 0.00004<\/td> 16.66679<\/td> 0.00012<\/td> 0.00372<\/td>\n<\/tr>\n \n\t 200000000<\/td> 16.65787<\/td> 16.67476<\/td> 16.66667<\/td> 16.66671<\/td> 0.00004<\/td> 16.66671<\/td> 0.00004<\/td> 0.00267<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n\n\n\n\n FIVE DICE<\/h3>\n\n\n\n
RollsFiveDice <- function<\/span>(iterations)<\/span>\n<\/span>{\n # Creates a vector with the possible values of the die<\/span>\n dado = 1<\/span>:6<\/span>\n result = 0<\/span>\n\t\n # Stores the die roll results<\/span>\n first_roll = sample(dado,iterations,replace=T)\n second_roll = sample(dado,iterations,replace=T)\n terceira_rolagem = sample(dado,iterations,replace=T)\n quarta_rolagem = sample(dado,iterations,replace=T)\n quinta_rolagem = sample(dado,iterations,replace=T)\n\n # Sums the vector values position-wise<\/span>\n result = first_roll + second_roll + terceira_rolagem + quarta_rolagem + quinta_rolagem\n\n # Counts every instance where the dice summed to 18<\/span>\n result = sum(result == 18<\/span>)\n\n # Turns the count into a percentage of the dice rolls<\/span>\n result = (result\/iterations) * 100<\/span>\n\t\n return<\/span>(result)\n}\n\n# Initializes the result and execution time vectors<\/span>\nresult_vector = c()<\/span>\ntime_vector = c()<\/span>\ncontrol = 0<\/span>\n\n# Controls the iteration quantity<\/span>\niterations = 100<\/span>\n\n# Stores the percentage of \"18\" results in a 10 size vector and the execution time in another (only 10 rounds for this heavier problem)<\/span>\nwhile<\/span>(control != 10<\/span>)\n{\n start_time = Sys.time()\n result_vector = append(result_vector, RollsFiveDice(iterations)) \n finish_time = Sys.time()\n\n time = finish_time - start_time\n time_vector = append(time_vector, time) \n\n control = control + 1<\/span>\n}\n\n# Shows the percentage of \"18\"<\/span>\nresult_vector\n\n# Shows the execution times<\/span>\ntime_vector\n<\/code><\/div>Code language:<\/span> R<\/span> (<\/span>r<\/span>)<\/span><\/small><\/pre>\n\n\n
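The 10.03% target for this test can be computed exactly by convolving the single-die distribution five times (a Python sketch, not part of the original experiment):

```python
from collections import Counter

def dice_sum_counts(n_dice, sides=6):
    """Exact distribution of the sum of `n_dice` fair dice, by repeated convolution."""
    counts = Counter({0: 1})
    for _ in range(n_dice):
        nxt = Counter()
        for total, ways in counts.items():
            for face in range(1, sides + 1):
                nxt[total + face] += ways
        counts = nxt
    return counts

counts = dice_sum_counts(5)
total = 6 ** 5  # 7776 equally likely outcomes
print(counts[17], counts[18], round(100 * counts[18] / total, 2))  # 780 780 10.03
```

Both sums, 17 and 18, occur in 780 of the 7776 possible outcomes, i.e. 780/7776, which is about 10.03%.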
\n\n
\n\t \nIterations<\/th> Min result<\/th> Max result<\/th> Expected result<\/th> Average result<\/th> Average result - Expected result<\/th> Result's median<\/th> Result's median - Expected result<\/th> Result's Standard deviation<\/th>\n<\/tr>\n<\/thead>\n \n\t 100<\/td> 5.00000<\/td> 31.00000<\/td> 16.66667<\/td> 16.37100<\/td> -0.29567<\/td> 16.00000<\/td> -0.66667<\/td> 3.77487<\/td>\n<\/tr>\n \n\t 1000<\/td> 13.40000<\/td> 21.80000<\/td> 16.66667<\/td> 16.64710<\/td> -0.01957<\/td> 16.60000<\/td> -0.06667<\/td> 1.18550<\/td>\n<\/tr>\n \n\t 10000<\/td> 15.33000<\/td> 17.77000<\/td> 16.66667<\/td> 16.68307<\/td> 0.01640<\/td> 16.68500<\/td> 0.01833<\/td> 0.38059<\/td>\n<\/tr>\n \n\t 100000<\/td> 16.23100<\/td> 17.06600<\/td> 16.66667<\/td> 16.66546<\/td> -0.00120<\/td> 16.66800<\/td> 0.00133<\/td> 0.11597<\/td>\n<\/tr>\n \n\t 1000000 <\/td> 16.54450<\/td> 16.81170<\/td> 16.66667<\/td> 16.66657<\/td> -0.00010<\/td> 16.66860<\/td> 0.00193<\/td> 0.03798<\/td>\n<\/tr>\n \n\t 10000000<\/td> 16.62888<\/td> 16.70282<\/td> 16.66667<\/td> 16.66683<\/td> 0.00016<\/td> 16.66705<\/td> 0.00038<\/td> 0.01155<\/td>\n<\/tr>\n \n\t 100000000<\/td> 16.65408<\/td> 16.67899<\/td> 16.66667<\/td> 16.66669<\/td> 0.00002<\/td> 16.66673<\/td> 0.00007<\/td> 0.00382<\/td>\n<\/tr>\n \n\t 200000000<\/td> 16.65881<\/td> 16.67455<\/td> 16.66667<\/td> 16.66675<\/td> 0.00009<\/td> 16.66682<\/td> 0.00015<\/td> 0.00258<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n\n\n\n\n CONCLUSION<\/h2>\n\n\n\n
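The error bands found empirically in these tests can also be derived by inverting the binomial standard-error formula, n = p(1-p) * (100/e)^2, where e is the target error in percentage points (a Python sketch, with an assumed probability near 50%):

```python
import math

def iterations_for_error(p, target_error_pct):
    # Invert 100 * sqrt(p * (1 - p) / n) = target_error_pct
    # to get the iteration count whose standard error hits the target
    return math.ceil(p * (1 - p) * (100 / target_error_pct) ** 2)

# Target errors of roughly one first-, second- and third-decimal-place unit
for err in (0.5, 0.05, 0.005):
    print(err, iterations_for_error(0.5, err))
```

This reproduces the bands observed in the tests: about 10k iterations for a first-decimal-place error, 1M for a second-decimal-place error, and 100M for a third-decimal-place error.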