How to compute running time for my simulation loop?

import numpy as np

from notears import utils
from notears.linear import notears_linear

def run_experiment():
    # utils.set_random_seed(1)  # left commented out: fixing the seed would
    # produce identical outcomes in every round of the loop

    n, d, s0, graph_type, sem_type = 1000, 20, 20, 'ER', 'gauss'
    B_true = utils.simulate_dag(d, s0, graph_type)
    W_true = utils.simulate_parameter(B_true)
    np.savetxt('W_true.csv', W_true, delimiter=',')

    X = utils.simulate_linear_sem(W_true, n, sem_type)
    np.savetxt('X.csv', X, delimiter=',')

    W_est = notears_linear(X, lambda1=0.1, loss_type='l2')
    assert utils.is_dag(W_est)
    np.savetxt('W_est.csv', W_est, delimiter=',')
    acc = utils.count_accuracy(B_true, W_est != 0)
    print(acc)

if __name__ == '__main__':
    num_experiments = 3

    for _ in range(num_experiments):
        run_experiment()

Now I need to compute the running time of the whole process. How do I do this? I don't know where to put the start time and end time.

Solution:

You can use `default_timer` from the `timeit` module:

from timeit import default_timer as timer  # wraps time.perf_counter on Python 3

if __name__ == '__main__':
    num_experiments = 3
    start = timer()  # record the start time before the loop
    for _ in range(num_experiments):
        run_experiment()
    end = timer()  # record the end time after all experiments finish
    print(end - start)  # elapsed time in seconds

The code above measures the total time, in seconds, taken to run all three experiments.
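If you also want to see how long each experiment takes individually (for example, to spot variance between runs), you can move the timer calls inside the loop. A minimal sketch, where `heavy_work` is a hypothetical stand-in for `run_experiment` so the snippet is self-contained:

```python
from timeit import default_timer as timer

def heavy_work():
    # Hypothetical stand-in for run_experiment(): just burns some CPU time.
    total = 0
    for i in range(100_000):
        total += i * i
    return total

num_experiments = 3
durations = []
for i in range(num_experiments):
    start = timer()          # start the clock for this experiment only
    heavy_work()
    end = timer()            # stop the clock right after it returns
    durations.append(end - start)
    print(f"experiment {i}: {end - start:.4f} s")

print(f"total: {sum(durations):.4f} s")
```

Summing the per-run durations gives (almost exactly) the same total as timing the whole loop once, but the per-run numbers make it easier to see whether one experiment is unusually slow.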
