Attempting To Speed Up The Time It Takes The Macro To Run?
Jan 8, 2013
I have recorded 7 different macros and then combined them all into one macro to achieve one end result. I am not sure if you can just look at the code to determine different ways to improve it, or if you need the Excel spreadsheet as well.
I have a worksheet with a Worksheet_Change macro. This macro makes sure that a few cells keep their colors, even if the user copies and pastes a new value into them, and it runs each time there is a change on the worksheet. Now my problem is that on the same sheet I have an update-list macro which updates around 20,000 rows across two columns (altogether around 40,000 values), and it takes a while to run. So it takes a lot of time (too much) when these two macros both run.
My question is: can I somehow disable the Worksheet_Change macro while the update-list macro runs? I mean something like disabling Worksheet_Change when I start the update-list macro, and re-enabling it when the update-list macro finishes?
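A minimal sketch of the standard approach: Application.EnableEvents suppresses Worksheet_Change (and every other event) for as long as it stays off, so the update macro can toggle it around its own work. The error handler is only there so events come back on even if the update fails partway.

VB:
Sub UpdateList()
    On Error GoTo CleanUp
    Application.EnableEvents = False     ' Worksheet_Change will not fire
    ' ... update the ~20,000 rows / 40,000 values here ...
CleanUp:
    Application.EnableEvents = True      ' always restore, or events stay off
End Sub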
I've done quite a bit of searching in the forum and online and haven't found anything that's generic and can be used at any time.
What I'm looking for is a way, or code, that tells you how long it takes a macro to run from start to finish, something that can be used to time any macro. I've seen threads in the forum where people say it took x seconds for their macro to run, but I'm not sure how they measured it.
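A minimal, generic sketch using VBA's built-in Timer function, which returns the seconds elapsed since midnight; MyMacro is a stand-in name for whatever macro you want to time:

VB:
Sub TimeMyMacro()
    Dim t As Single
    t = Timer                 ' seconds since midnight
    MyMacro                   ' stand-in for the macro being timed
    MsgBox "Run time: " & Format(Timer - t, "0.00") & " seconds"
End Sub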
I have a macro that needs a major improvement in performance. I have a quad-core 2.67GHz single-processor computer, and with the current logic it calculates 20 values per minute. I have data sets that can be up to 400,000 data points, which means it will take 333 hours. The attached Sample file has over 9,000 points; for this data set it takes 7.5 hours.
The core logic of the macro is to extract what I call “Break Point” into column M and “Time Difference” between the Break Points to column O.
I got the code from this forum (thank you DonkeyOte) and made some modifications. The modified logic does the following:
1) The user inputs the starting cell. In the Sample I use G200.
2) From the cell defined in the button, it moves down one cell in that column and compares that value to the original cell.
If the value is greater than the original cell, there are 2 possible outcomes:
a) Move down 1 cell in the column; if that value is less than the original cell, extract it, copy it to column M (the Break Point value), and copy the Time Difference value to column O. "Time Difference" is calculated as the difference between the 2 break points in column A. In this case I've hardcoded 0.003472222 to get 5 minutes, which matches each incremental time in column A, but I would like the macro to calculate that automatically from the difference between column A values.
b) Move down 1 cell in the column; if that value is greater than the original cell, keep moving down until a value less than the original cell is found. Once found, extract it, copy it to column M, and copy the Time Difference value to column O.
Once the lesser value has been copied, the logic flips:
a) Move down 1 cell in the column; if that value is greater than the original cell, extract it, copy it to column M, and copy the Time Difference value to column O.
b) Move down 1 cell in the column; if that value is less than the original cell, keep moving down until a value greater than the original cell is found. Once found, extract it, copy it to column M, and copy the Time Difference value to column O.
At any point in the process, if we find a value equal to the starting cell, we ignore it and the logic continues, flip-flopping like this down to the last value of column G.
OR, here is the logic I need in reverse:
If, after moving down 1 cell from the original cell, the value is less than the original cell, there are the following 2 outcomes:
a) Move down 1 cell in the column; if that value is greater than the original cell, extract it, copy it to column M, and copy the Time Difference value to column O.
b) Move down 1 cell in the column; if that value is less than the original cell, keep moving down until a value greater than the original cell is found. Once found, extract it, copy it to column M, and copy the Time Difference value to column O.
Once the greater value has been copied, the logic flips:
a) Move down 1 cell in the column; if that value is less than the original cell, extract it, copy it to column M, and copy the Time Difference value to column O.
b) Move down 1 cell in the column; if that value is greater than the original cell, keep moving down until a value less than the original cell is found. Once found, extract it, copy it to column M, and copy the Time Difference value to column O.
Again, at any point in the process, if we find a value equal to the starting cell, we ignore it and the logic continues, flip-flopping like this down to the last value of the column.
Once all the Break Points and the Time Differences between them are extracted for each value in column G, the AVERAGE, STDEV and MAX values of column O are calculated in columns Q, R and S.
There is a loop that controls the execution of the core logic until the last value in column G. In the macro I've hardcoded the last row as 9171, but I'd like the macro to determine the last row automatically.
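Both hardcoded values can be derived from the sheet itself; a minimal sketch, assuming the data starts at the top of columns A and G:

VB:
Dim lastRow As Long, timeStep As Double
lastRow = Cells(Rows.Count, "G").End(xlUp).Row       ' replaces the hardcoded 9171
timeStep = Range("A2").Value - Range("A1").Value     ' replaces the hardcoded 0.003472222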
I know this description is a bit long, but I've worked many hours on getting it to work properly. I just need some help making it run much faster. I hope someone can help me out on this; I have the right logic, I just need the speed now, and I really can't figure it out.
Attached is a sample file showing how the logic flips back and forth while moving down the column, starting in cell G200.
The sample shows the results of the first 4 loops (rows 200 to 203). The values in M and O are the results of the fourth loop.
Again, assistance is very much appreciated because I've taken this as far as I can with my limited experience.
EDIT - I can embed the macro, but it'll probably be much easier to actually see it in the sample file.
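For the speed problem itself, the usual culprit in a loop like this is reading and writing cells one at a time. A skeleton of the array-based restructuring, with the flip-flop comparison left as a placeholder (names are mine): read columns G and A into memory once, do all comparisons there, and write columns M and O back in one operation each.

VB:
Sub ExtractBreakPointsFast()
    Dim g As Variant, t As Variant
    Dim outM As Variant, outO As Variant
    Dim lastRow As Long, i As Long
    lastRow = Cells(Rows.Count, "G").End(xlUp).Row
    g = Range("G1:G" & lastRow).Value          ' whole column read in one shot
    t = Range("A1:A" & lastRow).Value          ' matching times from column A
    ReDim outM(1 To lastRow, 1 To 1)
    ReDim outO(1 To lastRow, 1 To 1)
    For i = 2 To lastRow
        ' ... apply the flip-flop break-point logic to g(i, 1) here,
        '     filling outM (break points) and outO (time differences) ...
    Next i
    Range("M1").Resize(lastRow, 1).Value = outM    ' one write instead of thousands
    Range("O1").Resize(lastRow, 1).Value = outO
End Sub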
I have a workbook that is only 345KB in size, yet it takes ages to open, and although it has lookups and retrieves data from the web, it just seems like it shouldn't be so slow.
I also keep getting a message at the bottom like this:
Excel takes about 10 minutes in the saving process. When I say 10 minutes, I mean the Excel screen freezes (says Not Responding) for about 10 minutes; then it actually saves at the very end in the normal time any other file would take, as you watch the progress bar go forward.
I know many of the common answers and have tried them, e.g. reducing the calculation time (which in turn reduces the saving time).
But in my circumstance the calculation takes a very reasonable amount of time, and you see the progress % going forward.
- I would say I have about 2,000 rows and 15 columns.
- They have SUMIFS formulas.
- They link to a different workbook.
- The workbook I am working on saves to the network.
- The sources of my SUMIFS are in the same folder on the network.
- The recalculation takes about 10 seconds at most.
- I have turned off recalculate-before-saving; it is all on manual calculation.
- When I hit Save, there are no calculations being performed.
- There are no macros in the workbook.
- There are only about 2 names in the Name Manager.
- Then it freezes for about 10 minutes.
- Then the progress bar starts moving, and it saves.
What is it doing in those 10 minutes?
One more item to note: when I break the links to the workbook, thereby removing the SUMIFS formulas, it's a snap.
Why does the existence of the SUMIFS extend saving time? I would completely understand if it lengthened calculation time, but if calculation is off, why does Excel even worry about it when saving?
I am using the formula below to distinctly count the number of customers that match the criteria I have in cells C7 and B10. The data is in a separate worksheet (shown named Detail) which will be changing on a monthly basis, so I don't want to use a pivot table. The detail data ranges from rows 7 to 40,000, the file is currently 8,610KB, and it can potentially grow.
This formula works but takes an excessive amount of time for one calculation, and I need it for multiple column and row criteria. So, can this calculation be changed to get the same result with a faster calculation time? I am using Excel 2003.
I have a large Excel 2007 file, around 60,000KB; 54,000KB is due to one of the worksheets, where I have 8,760 rows x 160 columns of data. The calculation time is not a problem; it is very fast, taking only 2-3 seconds. The problem is when I open or save the file: it takes around 2 minutes. That is not too much by itself, but it becomes too long when one has to open and save it several times. Is there any trick to decrease the time for opening or saving an Excel file?
I have three values:
1) Current Speed
2) Current Acceleration
3) Acceleration Growth

Assuming:
Current Speed = 0
Current Acceleration = 0.2 (each 'turn', the current speed will increase by this much)
Acceleration Growth = 0.2 (each 'turn', the current acceleration will grow by this much)

This gives a current speed over a series of 'turns' of:
0.0 + 0.2 = 0.2
0.2 + 0.4 = 0.6
0.6 + 0.6 = 1.2
1.2 + 0.8 = 2.0
2.0 + 1.0 = 3.0
3.0 + 1.2 = 4.2
4.2 + 1.4 = 5.6
5.6 + 1.6 = 7.2
7.2 + 1.8 = 9.0
etc.
What I'd like is a formula (or some way other than calculating each step) to tell me how many turns it would take for the Current Speed to reach or exceed X (for example, 100).
Basically, Turns to X speed = something clever * acceleration growth * something else very clever.
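There is a closed form. Because the acceleration increases by a fixed amount g each turn, the speed after n turns is an arithmetic series: v(n) = v0 + n*a0 + g*n*(n-1)/2, which with v0 = 0, a0 = 0.2, g = 0.2 reduces to 0.1*n*(n+1). Setting v(n) >= X and solving the quadratic gives the turn count directly; a sketch as a VBA function (the name and signature are mine):

VB:
Function TurnsToSpeed(v0 As Double, a0 As Double, g As Double, X As Double) As Long
    ' Speed after n turns: v0 + n*a0 + g*n*(n-1)/2
    ' Rearranged: (g/2)*n^2 + (a0 - g/2)*n + (v0 - X) >= 0
    Dim a As Double, b As Double, c As Double
    a = g / 2
    b = a0 - g / 2
    c = v0 - X
    ' Positive root of the quadratic, rounded up to a whole turn
    TurnsToSpeed = -Int(-((-b + Sqr(b * b - 4 * a * c)) / (2 * a)))
End Function

With the numbers above, TurnsToSpeed(0, 0.2, 0.2, 100) returns 32, since v(31) = 99.2 and v(32) = 105.6.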
I need to make a macro that takes workbooks or files and runs them through another macro. I already have the 2nd macro done and it is working perfectly; I just need to know how to make the one that finds the other files and runs them all through the macro I already made. My boss said he will have about 150-200 files to run through this macro.
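A minimal sketch of the driver, assuming the 2nd macro lives in the same workbook as this code and operates on the active workbook; the folder path and macro name are placeholders:

VB:
Sub RunMacroOnAllFiles()
    Dim folderPath As String, fileName As String
    Dim wb As Workbook
    folderPath = "C:\BossFiles\"                 ' placeholder path to the 150-200 files
    fileName = Dir(folderPath & "*.xls*")
    Application.ScreenUpdating = False
    Do While fileName <> ""
        Set wb = Workbooks.Open(folderPath & fileName)
        MySecondMacro                            ' placeholder name of your existing macro
        wb.Close SaveChanges:=True
        fileName = Dir                           ' next file in the folder
    Loop
    Application.ScreenUpdating = True
End Sub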
I am trying to create a macro that takes the average of the first 24 cells within a sheet, places the answer in a cell on the next sheet (e.g. Sheet2, cell A1), then goes back to the previous sheet, takes the average of the next 24 cells, and pastes this new average in A2. I want a loop that will do this 365 times.
I have only managed to create the following code; however, it only obtains the average of the first set of 24 cells starting from B6 in Sheet1. I don't know how to use offsets that well....
VB:
Sub Oval1_Click()
    Dim i As Long
    For i = 1 To 365
        ' Average each block of 24 cells (B6:B29, B30:B53, ...) on Sheet1
        ' and write the result to Sheet2!A1, A2, ... (sheet names assumed
        ' from the description above)
        Sheets("Sheet2").Cells(i, 1).Value = _
            Application.WorksheetFunction.Average( _
                Sheets("Sheet1").Cells(6 + (i - 1) * 24, "B").Resize(24, 1))
    Next i
End Sub
I have a fairly simple macro that takes a few seconds on my XP-computer with Excel 2003 but takes several minutes on my Vista-computer with Excel 2007.
The XP-PC has 2GB memory, the Vista-PC only 1GB, but it's hard to believe it's only that. Is Excel 2007 so much slower than 2003?
The macro makes quite extensive use of the .Rows(Rownr).Delete method. Is the fact that 2007 has 1 million rows against the 65,536 of 2003 the culprit? It has to shift much more data up when deleting a row, no?
The macro clears specific columns in a row when you click anywhere on the row and then hit the command button. It clears the first range and 2nd range in 2 distinct steps, and takes up to 3 seconds.
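One technique worth trying, sketched below under assumed conditions: instead of calling .Rows(Rownr).Delete inside the loop, collect the rows into one Range with Union and delete them in a single operation, so the sheet only shifts data up once regardless of version.

VB:
Sub DeleteRowsInOneShot()
    Dim toDelete As Range, r As Long
    For r = 2 To 1000                            ' assumed row span
        If Cells(r, 1).Value = "" Then           ' assumed delete condition
            If toDelete Is Nothing Then
                Set toDelete = Rows(r)
            Else
                Set toDelete = Union(toDelete, Rows(r))
            End If
        End If
    Next r
    If Not toDelete Is Nothing Then toDelete.Delete   ' one shift instead of many
End Sub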
I am having a big problem with recalculating cells after running a macro. The macro runs at a great speed, but once calculation is switched back to automatic, the cells do not recalculate fast enough. I waited 4 minutes and it still had not moved past 0%. The worksheet has approximately 9,000 rows out to column IG, and 85% of the cells contain IF formulas.
I have an analysis that is run by a macro in one workbook, and it starts an analysis process in another workbook. The data is picked up in the No. 2 workbook and returned to the first workbook. It analyzes lots of workbooks, sometimes up to 1,000, which means the No. 2 workbook gets a new name and is then saved.
I once heard that the process could be much faster if the workbooks the analysis is processed through were not saved, and I actually don't need those books as long as I get the data into my first workbook.
But I'm not sure what in the macro makes it save the No. 2 workbook, and I would really like to speed up this process. As it is now, I have to start the analysis before I go to bed and hope it's done when I wake up the next morning.
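The save will be whatever wb.Save or wb.SaveAs call the macro makes on the No. 2 workbook; if that workbook is only a scratch pad, closing it with SaveChanges:=False skips the save step entirely. A minimal sketch (the variable name and path are mine):

VB:
Dim wb2 As Workbook
Set wb2 = Workbooks.Open("C:\Data\Analysis.xls")   ' placeholder path
' ... run the analysis and copy the results into the first workbook ...
wb2.Close SaveChanges:=False                       ' discard: no save, no new name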
I wrote a macro that works fine, although it runs extremely slowly as if plodding along through all of the cells one at a time. I'm sure the computer is faster than that, so I would like it to chug through more quickly.
I am using some syntax that could be optimized ....
I'm trying to determine the speed of a macro. I searched and have had no luck. Recently, with some help, I reduced my macro's run time from minutes to seconds, and I was wondering whether there is code out there with which I can record the speed of an existing macro.
I am dealing with data sets from various instruments that have different sample rates. I am deleting data points I don't need from some of the sets with higher sample rates so that all the data is on the same time scale.
The macro I have is super simple, but incredibly slow. I'm simply deleting every other cell down a column.
VB:
Sub OATcondense()
    Application.ScreenUpdating = False
    Do While ActiveCell <> ""
        ActiveCell.Offset(1, 0).Delete Shift:=xlUp   ' delete the cell below, shifting up
        ActiveCell.Offset(1, 0).Select               ' step down past the kept cell
    Loop
    Application.ScreenUpdating = True
End Sub
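A faster variant, sketched under the assumption that the data is one contiguous column of values starting at the active cell: read the column into an array, keep every other value, and write the compacted result back in one operation instead of deleting cell by cell.

VB:
Sub OATcondenseFast()
    Dim src As Range, v As Variant, out As Variant
    Dim i As Long, n As Long
    Set src = Range(ActiveCell, Cells(Rows.Count, ActiveCell.Column).End(xlUp))
    v = src.Value                                ' whole column into memory
    ReDim out(1 To UBound(v, 1), 1 To 1)
    For i = 1 To UBound(v, 1) Step 2             ' keep rows 1, 3, 5, ...
        n = n + 1
        out(n, 1) = v(i, 1)
    Next i
    src.ClearContents
    src.Resize(n, 1).Value = out                 ' one write-back
End Sub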
I've got a rather involved macro that's running kind of slowly, and I would appreciate any help speeding it up. It's in two parts: the first creates and emails a report; the second formats it so it's pretty for printing. The full code for both routines is pasted below.
The email part I developed first and it runs pretty quickly. Afterwards, I added the second macro, which is called halfway through the first.
Stepping through the code in the second macro, the problem I see is in this section, the setup for setting the height of merged cells in the report:
I run a simple macro loop to clean some data across nine columns. The purpose is to collapse the data in the columns so that column 1 has the first value found in that row, for the set of columns. For instance, if columns 1-4 are empty, it deletes / shifts everything left until the first column is not empty. Then it goes to the next row and repeats. Data can range from a few rows up to 6000.
[I have a period in the data as the cell content to evaluate]
Sub A_Rollup_collapse()
    Dim Col As Integer, LastR As Long, R As Long
    Dim StartT As Date, EndT As Date

    StartT = Now
    Col = Range("IV1").End(xlToLeft).Column - 9
    LastR = Range("A60000").End(xlUp).Row

    Application.ScreenUpdating = False
    For R = 2 To LastR
        ' Keep deleting while the evaluated cell holds the "." marker,
        ' shifting the remaining cells in the row to the left
        Do While Cells(R, Col) = "."
            Cells(R, Col).Delete Shift:=xlShiftToLeft
        Loop
    Next R
    EndT = Now
    Application.ScreenUpdating = True
End Sub
This macro (B) runs after another macro (A) that populates the nine columns with data using VLOOKUPs. Macro (A) builds out a chain of information from column 2 to column 9, converts it to values, etc. Nothing odd.
When macro (B) is called right after running macro (A), it can take about one minute for 500 rows of data.
When I save and close the workbook, reopen it and run macro (B), it only takes one second.
When I insert a ThisWorkbook.Save between the two call statements, macro (B) still takes over a minute.
A minute is not too bad but when I'm dealing with thousands of rows, the difference is more like 30 seconds vs. 9 minutes which is a problem.
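One standard thing to rule out before digging deeper: with automatic calculation on, every Cells(...).Delete in macro (B) can trigger a recalculation pass over whatever macro (A) left pending, which would explain why the same loop is fast after a fresh reopen. Suspending calculation around the collapse loop makes the timing independent of what ran before; a sketch:

VB:
Application.Calculation = xlCalculationManual
' ... run the collapse loop from macro (B) here ...
Application.Calculation = xlCalculationAutomatic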
It takes all numbers in column 4, starting with row 15, and deletes all duplicates. It then shows the number of times each number was duplicated and puts this count in column 3. MY PROBLEM: the macro searches each line and takes FOREVER! I have data with thousands of lines. I already tried the ScreenUpdating method, which really doesn't help that much. Is there possibly a better code for doing this?
Sub Factor()
    Dim sID As String
    Dim sOldID As String
    Dim lLastRow As Long
    Dim lrow As Long
    Dim lcount As Long
    Dim lLoop As Long

    lLastRow = ActiveSheet.Cells(Rows.Count, 1).End(xlUp).Row
    lrow = 15
    sID = ActiveSheet.Cells(lrow, 4).Value
    sOldID = "ActiveSheet.Cells(4, 15).Value"
    lcount = 1
    lLoop = 1
    Do While Len(sID) <> 0
        If sID <> sOldID Then
            If lLoop = 1 Then.................................
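A sketch of a faster approach, assuming the numbers sit in column D from row 15 down and the counts go to column C: tally every value in one pass with a Scripting.Dictionary, then write the de-duplicated list and counts back in a second pass, so nothing is searched line by line.

VB:
Sub FactorFast()
    Dim dict As Object, v As Variant, k As Variant
    Dim lastRow As Long, i As Long, n As Long
    Set dict = CreateObject("Scripting.Dictionary")
    lastRow = ActiveSheet.Cells(Rows.Count, 4).End(xlUp).Row
    v = Range(Cells(15, 4), Cells(lastRow, 4)).Value
    For i = 1 To UBound(v, 1)
        dict(v(i, 1)) = dict(v(i, 1)) + 1        ' occurrences per value
    Next i
    Range(Cells(15, 3), Cells(lastRow, 4)).ClearContents
    n = 15
    For Each k In dict.Keys
        Cells(n, 4).Value = k                    ' unique value
        Cells(n, 3).Value = dict(k)              ' duplicate count
        n = n + 1
    Next k
End Sub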
I am currently using the following code to copy records from one sheet to new sheets that are created and named in the first part of the If(). This works fine; however, when I am dealing with 50,000 records it still takes 5-10 minutes to get them all sorted. I think this, although functional, is horribly inefficient, and I am wondering how I might speed up the process. Possibly, rather than checking each record and then copying and pasting individually, I could read down the list of cells until the value changes and then copy the whole block over at once; I don't know if this would be faster or not. Let me know what you think (see the sketch after the code below).
Sub autorec()
    Dim wSheet
    Dim newSheetName As Variant
    Dim FNAC As Double
    Dim OU As Double
    Dim DS As Double
    Dim CCY As String
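A sketch of the block-copy idea, under an assumed layout (key in column A, header in row 1, destination sheets already created and named after the keys): sort once by the key so equal values are contiguous, then copy each run in a single operation.

VB:
Sub CopySortedBlocks()
    Dim lastRow As Long, startRow As Long, r As Long
    lastRow = Cells(Rows.Count, 1).End(xlUp).Row
    Range("A1").CurrentRegion.Sort Key1:=Range("A2"), _
        Order1:=xlAscending, Header:=xlYes       ' group equal keys together
    startRow = 2
    For r = 2 To lastRow
        If Cells(r + 1, 1).Value <> Cells(startRow, 1).Value Then
            ' copy the whole run startRow..r at once to its sheet
            Rows(startRow & ":" & r).Copy _
                Sheets(CStr(Cells(startRow, 1).Value)).Rows(2)
            startRow = r + 1
        End If
    Next r
End Sub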
Attached is a workbook that takes an imported text file and inserts rows with text.
It is SLOW; you can watch each line being inserted with the text. I would think that with such a small sample dataset it would be much, much quicker.
Information: See attached xls file.
On Sheet1 is the imported data; Sheet2 is a copy of the imported data, so one can copy and paste to Sheet1 as needed without re-importing, for test purposes and only for this query. (Sheets 2 through 3 are not otherwise used.)
There is a command button on Sheet1 which will run macro "aaa". If you run this you will see how slow it is and exactly what it is doing.
I am simply looking for a way to speed this up; I have some files that are 10 times the size of the sample data, and they take 10 or more minutes to run.
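Without changing macro "aaa" itself, the first thing to try is a wrapper that suspends redraw and recalculation while it runs; being able to watch each row appear is usually a sign that ScreenUpdating is still on. A sketch:

VB:
Sub aaa_Fast()
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual
    aaa                                          ' the existing insert-rows macro
    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub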
I'm setting up a spreadsheet that does engineering calculations. I'm using macros to run sizes from a standard schedule. It basically takes the values from one sheet (the schedule) to another (the calculation), then the result from the calculation sheet (the value only, not the link) is pasted back into the schedule. The macro seems very bulky, and I'm sure it can be made more efficient with a loop. Here is a sample of the code from the macro;
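A generic sketch of the loop shape, with placeholder addresses in place of the real ones (sizes assumed in Schedule!A2:A50, calculation input in Calculation!B1, result in Calculation!B10), in case it helps structure the repeated blocks:

VB:
Sub RunSchedule()
    Dim r As Long
    Application.ScreenUpdating = False
    For r = 2 To 50                              ' placeholder schedule range
        Sheets("Calculation").Range("B1").Value = Sheets("Schedule").Cells(r, 1).Value
        Sheets("Calculation").Calculate          ' recalc with the new size
        Sheets("Schedule").Cells(r, 2).Value = _
            Sheets("Calculation").Range("B10").Value   ' paste value only, no link
    Next r
    Application.ScreenUpdating = True
End Sub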
I am trying to parse and remove unwanted rows from a very large text file using VBA. At the moment the application runs rather slowly, and I was wondering if the experts could give some pointers on how to make the code more efficient, while still keeping it simple so others may be able to modify it later. I am keeping ScreenUpdating set to True, as the alternative (False) will just show Excel as Not Responding to the user until the VB is finished.
Sub deleteReplaceRows()
    Application.ScreenUpdating = True
    Dim DeletedRows As Integer
    Dim lastRow As Long
    Dim Arr(7)
    Arr(1) = "<PUZZLE>"
    Arr(2) = "<%"
    Arr(3) = "%>"
    Arr(4) = "Response."
    Arr(5) = "</PUZZLE>"
    Arr(6) = "<HINT>"
    Arr(7) = "<MESSAGE>"

    For i = 1 To 7
        Do
            Set rng = Columns(1).Find(Arr(i))
            If rng Is Nothing Then Exit Do
            rng.EntireRow.Delete
            DeletedRows = DeletedRows + 1...................
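A sketch of a faster variant using AutoFilter, which removes all matching rows for one pattern in a single delete instead of one Find/Delete round trip per row. This assumes row 1 can be treated as a header and that wildcard matching on the markers is acceptable:

VB:
Sub deleteReplaceRowsFast()
    Dim i As Long, lastRow As Long
    Dim Arr(1 To 7) As String
    Arr(1) = "<PUZZLE>": Arr(2) = "<%": Arr(3) = "%>": Arr(4) = "Response."
    Arr(5) = "</PUZZLE>": Arr(6) = "<HINT>": Arr(7) = "<MESSAGE>"
    For i = 1 To 7
        lastRow = Cells(Rows.Count, 1).End(xlUp).Row
        Range("A1:A" & lastRow).AutoFilter Field:=1, Criteria1:="*" & Arr(i) & "*"
        On Error Resume Next                     ' no matches -> nothing visible
        Range("A2:A" & lastRow).SpecialCells(xlCellTypeVisible).EntireRow.Delete
        On Error GoTo 0
        ActiveSheet.AutoFilterMode = False       ' clear the filter for the next pattern
    Next i
End Sub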
I have a range of cells (C2, C5, C8:N1007, P8:P1007, ...) on a worksheet entitled Database that I want to be able to clear the contents of through a macro.
I have 3 additional worksheets, all of which have summary information on them, and some of the formulas are huge.
I've inserted a module and produced this code;
Sub ClearContents()
    Application.Calculation = xlCalculationManual
    Range("C2, C5, C8:N1007, P8:P1007, ....").ClearContents
    Application.Calculation = xlCalculationAutomatic
End Sub
Because of all the summary-page formulas, the code takes a few seconds to run.
I'm new to VBA and just wondered whether I could switch the calculation setting for the entire workbook to manual at the beginning of the code and then switch it back to automatic at the end. I think this might speed things up.