Increase Speed
Jan 15, 2007
I'm looking for ideas on speeding up this little routine that deletes rows that have identical values in certain cells in the row above.
Sub remo() ...
I'm trying to automate a process where I get a list of checks and money orders purchased and then manipulate it to make the auditors happy. I need to eliminate all transactions less than $3,000. The fly in the ointment, however, is that I must NOT eliminate daily transactions by the same purchaser that are individually less than $3,000 but, when added together, are greater than $3,000.
I've created a helper column where I've inserted the formula ....
My macro works fine, but I'm interested in seeing if I can speed it up. Starting at the bottom, I compare each row with the row above; if they match in font color, the macro fills the top row with orange and deletes the bottom one. This works, but it usually takes several minutes.
Public Sub ADMINCompareList()
    Dim varTest1, varTest2
    Dim lng As Long, i As Integer, iTest As Integer
    Application.ScreenUpdating = False
    Worksheets("ADMIN").Activate
    For lng = ActiveSheet.UsedRange.Rows.Count To 2 Step -1
        If Not Range("M" & lng).Font.Color <> Range("M" & lng - 1).Font.Color Then
            GoTo NewRow
        End If
        varTest1 = Intersect(Range("J:W"), Rows(lng))
        varTest2 = Intersect(Range("J:W"), Rows(lng - 1))
        For i = 1 To 14 ...
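A common way to speed up this kind of loop is to avoid deleting rows one at a time and to compare values from in-memory arrays instead of cell by cell. Below is a hedged sketch, not a drop-in replacement: it follows the prose description (delete the lower row when the column M font color and the J:W values match the row above), and the sheet name, columns, and orange fill are assumptions carried over from the posted code.
Public Sub ADMINCompareListFast()
    Dim ws As Worksheet, lng As Long, i As Long
    Dim rngDelete As Range, vUpper As Variant, vLower As Variant
    Dim bMatch As Boolean

    Set ws = Worksheets("ADMIN")
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual

    For lng = ws.UsedRange.Rows.Count To 2 Step -1
        If ws.Range("M" & lng).Font.Color = ws.Range("M" & (lng - 1)).Font.Color Then
            ' read both J:W slices once and compare in memory
            vLower = ws.Range("J" & lng & ":W" & lng).Value
            vUpper = ws.Range("J" & (lng - 1) & ":W" & (lng - 1)).Value
            bMatch = True
            For i = 1 To 14
                If vLower(1, i) <> vUpper(1, i) Then bMatch = False: Exit For
            Next i
            If bMatch Then
                ws.Rows(lng - 1).Interior.Color = RGB(255, 165, 0)   ' fill the top row orange
                If rngDelete Is Nothing Then
                    Set rngDelete = ws.Rows(lng)
                Else
                    Set rngDelete = Union(rngDelete, ws.Rows(lng))
                End If
            End If
        End If
    Next lng

    If Not rngDelete Is Nothing Then rngDelete.Delete   ' one delete instead of many
    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub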
I have a test due in the morning, and I really need this question answered ASAP, if anyone could. I need to create an IF formula for this situation: the standard is 500 MHz. Give 55.00 for that standard, but for every 100 MHz increase above it, give an additional 25.00.
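Something along these lines would work, assuming the MHz value is in A1 (a hypothetical cell) and that partial hundreds are paid proportionally:
=55+MAX(0,A1-500)/100*25
If only each full 100 MHz step should count, wrap the increase in INT: =55+INT(MAX(0,A1-500)/100)*25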
I'm reading in a text file with stocks data in the following format:
JRV,Jervois SUM,D,20090807,000000,0.00600,0.00600,0.00500,0.00600,41370,0
JYC,Joyce SUM,D,19900102,000000,1.25410,1.25410,1.25410,1.25410,0,0
etc...
My code then compares the three-letter code with a list in the following format and creates a new text file in which data is included only if the code is on the list. Since I added the range search to the code, it runs VERY slowly (a dictionary-based sketch follows the list below).
AACAust A Foo
AAFAustral AfMat
AAMA1 MineralMat
AARAnglo AustMat
AAXAusenco LiCap
ABBAbb Grain Foo
ABCAdelaide BMat
etc...
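A hedged sketch, with assumptions: the allowed codes sit in column A of a sheet named "Codes" (the three-letter code being the first three characters of each entry, as in the list above), and the file paths are placeholders. A Dictionary lookup is constant-time per line, so it avoids searching a range for every record.
Sub FilterStockLines()
    Dim dict As Object, cell As Range
    Dim fIn As Integer, fOut As Integer
    Dim sLine As String, sCode As String

    Set dict = CreateObject("Scripting.Dictionary")   ' late binding - no reference needed
    With Worksheets("Codes")
        For Each cell In .Range("A1", .Cells(.Rows.Count, 1).End(xlUp))
            dict(Left$(CStr(cell.Value), 3)) = True   ' build the lookup table once
        Next cell
    End With

    fIn = FreeFile
    Open "C:\data\stocks.txt" For Input As #fIn            ' placeholder input path
    fOut = FreeFile
    Open "C:\data\stocks_filtered.txt" For Output As #fOut ' placeholder output path
    Do While Not EOF(fIn)
        Line Input #fIn, sLine
        If Len(sLine) > 0 Then
            sCode = Split(sLine, ",")(0)
            If dict.Exists(sCode) Then Print #fOut, sLine  ' keep only listed codes
        End If
    Loop
    Close #fIn
    Close #fOut
End Sub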
How to speed up Autofiltering? Excel 2003 sp2 hangs up for 10 minutes after
I choose a record in the Autofilter.
I'm working on a financial reporting project that should be in Access, but unfortunately it must be in Excel. Some of the formulas are complex, and I have a UDF to calculate these values. I added the line
Application.Volatile
to each UDF, but when I change the current month in a dropdown box, the UDFs do not recalculate. The dropdown box sets a period number on one of the worksheets; this same value is passed to each UDF. I tried using this code in my dropdown box:
Sub DropDown4_Change()
Application.CalculateFull
End Sub
but the PC just hangs. I have hundreds (more likely thousands) of formulas in the spreadsheet, and the recalc recalculates everything, whereas I just want it to recalculate the UDFs. I even changed all of the SUMPRODUCT formulas to array SUM(IF()) formulas, which sped things up - that is, until I forced the full recalc in the dropdown's change event.
So my question is: is it possible to recalculate just the UDFs on 3 worksheets when the user selects a different period in the dropdown box?
And a supplementary question: if {SUM(IF(...))} formulas are faster than SUMPRODUCT formulas, would a (well-written) UDF perform faster than a {SUM(IF(...))} formula?
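If the three worksheets holding the UDF calls are known, one hedged alternative to a full recalc is to calculate just those sheets in the dropdown's macro; because the UDFs are marked Application.Volatile they will be re-evaluated by Worksheet.Calculate. The sheet names below are placeholders, and if formulas on one sheet feed another, calculate the feeding sheet first.
Sub DropDown4_Change()
    Application.ScreenUpdating = False
    Worksheets("Report1").Calculate   ' placeholder names for the three sheets with UDFs
    Worksheets("Report2").Calculate
    Worksheets("Report3").Calculate
    Application.ScreenUpdating = True
End Sub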
I will attach a sample of the data when I figure out how to. The original data takes about 8 or 9 seconds to delete the lines.
Code: ...
I'm required to do some Monte Carlo analysis with 1,000,000 simulations. I have managed to find some free code; however, the time it took to run the 1,000,000 trials was over 30 minutes. Is that normal? The code that takes the longest to run is the following:
For i = 1 To number_of_trials
    Application.Calculate
    For j = 1 To number_of_formulas
        runs(j, i) = sel.Cells(1, 1 + j)
    Next j
Next i
Is there any way I can improve this code to make it run faster? I have already tried Application.ScreenUpdating = False.
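Thirty minutes may well be normal for a million full recalculations, but two things generally help: switch calculation to manual and recalculate only the sheet holding the model, and read each trial's outputs in one block instead of cell by cell. A hedged sketch of the same loop, assuming "sel" is the range whose first row holds the output formulas and that the random inputs and outputs sit on the same worksheet (otherwise keep Application.Calculate):
Dim i As Long, j As Long, vRow As Variant

Application.ScreenUpdating = False
Application.Calculation = xlCalculationManual

For i = 1 To number_of_trials
    sel.Worksheet.Calculate                                       ' recalc one sheet, not the whole workbook
    vRow = sel.Cells(1, 2).Resize(1, number_of_formulas).Value    ' one read per trial (assumes more than one output)
    For j = 1 To number_of_formulas
        runs(j, i) = vRow(1, j)
    Next j
Next i

Application.Calculation = xlCalculationAutomatic
Application.ScreenUpdating = True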
I have a time (1:08:31) that it took to travel 35 km. How can I calculate the average speed of this competitor, plus a number of others who recorded faster or slower times?
I managed to convert the time to seconds, but when I apply a simple formula to convert to km/h it never works.
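For what it's worth: if the elapsed time is stored as a true Excel time value (say in A1, a hypothetical cell) rather than as text, Excel holds it as a fraction of a day, so multiplying by 24 converts it to hours and the speed in km/h is just distance divided by hours:
=35/(A1*24)
With 1:08:31 that gives roughly 30.65 km/h. A common reason it "never works" is that the result cell is still formatted as a time; format it as a number instead.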
The macro clears specific columns in a row when you click anywhere on the row and then hit the command button. It clears the first range and the second range in two distinct steps, and takes up to 3 seconds.
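Without seeing the macro this is only a guess, but clearing both areas in a single operation (with screen updating off) is usually near-instant; the column letters here are placeholders for the two ranges being cleared:
Application.ScreenUpdating = False
Intersect(ActiveCell.EntireRow, Union(Columns("C:F"), Columns("J:M"))).ClearContents
Application.ScreenUpdating = True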
My question is about the If-Else construct.
I often write If-Else statements that require an action be taken only if something is true. If that something is false, no action is to be taken.
My question is: how do you code "no action"?
The following is what I usually do ...
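"No action" doesn't have to be coded at all: just omit the Else branch (or leave it empty). For example (cell addresses are only illustrative):
If Range("A1").Value > 100 Then
    Range("B1").Value = "Over limit"
End If
' which behaves exactly like an If-Else whose Else branch is empty:
If Range("A1").Value > 100 Then
    Range("B1").Value = "Over limit"
Else
    ' no action
End If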
I have 3 numbers:
1) Current Speed
2) Current Acceleration
3) Acceleration Growth
Assuming:
Current Speed=0
Current Acceleration = 0.2 (each 'turn' the current speed will increase by this much)
Acceleration Growth = 0.2 (each 'turn', the current acceleration will grow by this much)
This gives a current speed over a series of 'turns' as
0.0 + 0.2 = 0.2
0.2 + 0.4 = 0.6
0.6 + 0.8 = 1.4
1.4 + 1.0 = 2.4
2.4 + 1.2 = 3.6
3.6 + 1.4 = 5.0
5.0 + 1.6 = 6.6
6.6 + 1.8 = 8.4
8.4 + 2.0 = 10.4
etc.
What I'd like to do is have a formula (or some way other than calculating each step) to tell me how many turns it would take for the Current Speed to reach or exceed X (for example, 100).
Basically, Turns to X speed = something clever * acceleration growth * something else very clever.
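Assuming the model described in words (each turn the speed increases by the current acceleration a, which then grows by g), the speed after n turns is S(n) = a*n + g*n*(n-1)/2, so reaching X means solving (g/2)*n^2 + (a - g/2)*n - X = 0 for n and rounding up. With a, g and X in cells A1, A2 and A3 (hypothetical placements), a worksheet version would be:
=CEILING((-(A1-A2/2)+SQRT((A1-A2/2)^2+2*A2*A3))/A2,1)
For a = g = 0.2 and X = 100 this gives 32 turns.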
How do you speed up a shared Excel worksheet?
A1 has some characters
This code will generate all possible words that can be made using all of the characters.
The system: permute the letters and check spelling; if a permutation is spelled correctly, write it to column B.
example
A1: iftrs
results: first frits rifts
Option Explicit
Dim CurrentRow
Const col = 2
Sub correctly_spelled_permutations()
    Dim InString As String
    Dim CalcSet As Integer
    InString = Range("A1")
    If Len(InString) < 2 Then Exit Sub
    With Application
        .ScreenUpdating = False
        CalcSet = .Calculation
        .Calculation = xlCalculationManual
        .EnableCancelKey = xlErrorHandler
        .StatusBar = "searching valid combination" ...
I use Application.ScreenUpdating = False all the time. Are there any other things like this which speed up macros?
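The usual companions to ScreenUpdating, offered as a general sketch (restore them at the end, ideally via an error handler so they are not left off if the macro fails):
With Application
    .ScreenUpdating = False
    .Calculation = xlCalculationManual   ' stop recalculation while the macro writes
    .EnableEvents = False                ' stop event macros firing on every change
    .DisplayStatusBar = False
End With
'... macro body ...
With Application
    .DisplayStatusBar = True
    .EnableEvents = True
    .Calculation = xlCalculationAutomatic
    .ScreenUpdating = True
End With
Beyond settings, the biggest wins usually come from reading and writing ranges as arrays instead of cell by cell, and from avoiding Select/Activate.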
I have a Frame on a UserForm. The Width of the frame depends on values given in TextBox.Dat1 and TextBox.Dat2.
Now I tried it with a width of 2200 with a scroll bar, but when I change the value in the TextBoxes I need the Frame to be cleared. This is my question: I use "UserForm1.Frame1.Clear", but this can take up to 1 minute.
Is there any way to speed this up?
I have an Excel file which has 200,000 rows. Recently I learned from this forum how to use VBA (Excel macro code) to select some data for analysis, but it takes very long. Any suggestion will be appreciated.
I have already increased my notebook's RAM to 2 GB; it is still slow.
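Without seeing the code this is only a sketch, but with 200,000 rows the usual culprit is touching cells one at a time. Reading the whole block into a Variant array, filtering in memory, and writing the result back in one shot is typically far faster. The sheet names and the "KEEP" test below are placeholders:
Sub SelectFastSketch()
    Dim vData As Variant, vOut() As Variant
    Dim r As Long, c As Long, nOut As Long

    vData = Worksheets("Data").Range("A1:G200000").Value   ' one read instead of 200,000
    ReDim vOut(1 To UBound(vData, 1), 1 To 7)

    For r = 1 To UBound(vData, 1)
        If vData(r, 3) = "KEEP" Then                        ' placeholder test on column C
            nOut = nOut + 1
            For c = 1 To 7
                vOut(nOut, c) = vData(r, c)
            Next c
        End If
    Next r

    If nOut > 0 Then Worksheets("Results").Range("A1").Resize(nOut, 7).Value = vOut
End Sub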
Weekly, I have a report that is generated that is over 5,000 lines long and 4 columns wide.
I currently copy and paste one page at a time to make it 16 columns across the page. Is there a function in Excel to do this?
I am having a big problem with recalculating cells after running a macro. The macro runs at a great speed, but once calculation is switched back to automatic, the cells are not recalculating fast enough. I waited for 4 minutes and it still had not moved past 0%. The worksheet has approximately 9,000 rows and columns out to IG. 85% of the cells contain an IF formula.
The formula: ={IF(AND(ISNUMBER($A3);($A3-DAY($A3)+1)=F$2);$D3;IF(AND(F$2 > ($B3-DAY($B3));F$2 < DATE(YEAR($C3);MONTH($C3)+1;0));$D3/DATEDIF($B3-DAY($B3);DATE(YEAR($C3);MONTH($C3)+1;1);"m");0))}
I need to use this formula in more than 30,000 rows and more than 50 columns. Is it possible to speed up the formula? Or maybe to handle this with a macro?
I currently have a macro set up to delete rows if a certain user-selected value is not found in a certain column. It works fine if the sheet isn't overly large, but the problem is we have some spreadsheets with 25,000+ rows and it takes time to loop through; I'm not sure if there is even a way to make it faster.
Currently it works by looping backwards through the sheet, checking each cell's value against an array containing the user-selected values. If a match isn't found, it deletes the row.
'y = long value representing row
'x = counter for each item in user selected array
'wsSheet = worksheet we are using
'rowLast = last row on spreadsheet
'arrSearch() = string containing user selected items
I have ScreenUpdating, calculation, and events turned off; I just didn't know if there was a faster way of looping through.
I just thought that, instead of using an array, I could maybe join the array into one string and use a "Like" comparison to check whether a match is found; that would save me from having to loop x * y times. I'm not sure if this would work or not; I'll post back if it doesn't.
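Something along those lines can work; the sketch below joins the selections into one delimited string, tests each cell with InStr, and then deletes all the non-matching rows in a single operation, which is usually the bigger win than the comparison itself. It reuses the variable names from the comments above; colSearch is a placeholder for the column being checked:
Dim sMatch As String, rngDelete As Range
Dim y As Long
Const colSearch As Long = 1

sMatch = "|" & Join(arrSearch, "|") & "|"   ' delimiters prevent partial matches

For y = rowLast To 2 Step -1
    If InStr(1, sMatch, "|" & wsSheet.Cells(y, colSearch).Value & "|", vbTextCompare) = 0 Then
        If rngDelete Is Nothing Then
            Set rngDelete = wsSheet.Rows(y)
        Else
            Set rngDelete = Union(rngDelete, wsSheet.Rows(y))
        End If
    End If
Next y

If Not rngDelete Is Nothing Then rngDelete.Delete   ' one delete instead of thousands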
I have a rather large spreadsheet that takes a very long time to calculate once the new data is added. One of the many things I need to do is look to see if a unique value in range 1 is also in range 2. If it is, return some data (vlookup), if it's not, then I want a "0", not #NA. My question is, which of these two methods will result in a faster calculation (if at all):
Option 1: Do it in one step
=IF(ISNA(VLOOKUP($A3,LY,3,FALSE)),0,VLOOKUP($A3,LY,3,FALSE))
Or option 2: Do it in two steps:
Column N formula:
=VLOOKUP($A3,LY,3,FALSE)
Column N+1 formula:
=IF(ISNA(N3),0,N3) (where N3 is the column N value on that row)
I have many columns of data using formulas in option 1, so if I have coded this badly that could be my problem...
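For what it's worth, a hedged answer: option 2 is usually faster than option 1 because the VLOOKUP is evaluated once per row instead of twice. If your Excel version has IFERROR (Excel 2007 and later), you can also get a single lookup in a single cell:
=IFERROR(VLOOKUP($A3,LY,3,FALSE),0)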
I have 40,000 rows filled with data from columns A to G, and AutoFilter is always on for those columns. The problem is that when I select any column to filter data for my query, it takes a long time to display the result and sometimes it hangs with the message that the xyz workbook is not responding.
I have an analysis that is run by a macro in one workbook, and it starts an analysis process in another workbook. The data is picked up in the second workbook and returned to the first. It analyzes lots of workbooks, sometimes up to 1,000, which means the second workbook gets a new name and is then saved.
I have heard that the process could be much faster if the workbooks the analysis runs through were not saved, and I actually don't need those workbooks as long as I get the data into my first workbook.
But I'm not sure what in the macro makes it save the second workbook, and I would really like to speed up this process. As it is now, I have to start the analysis before I go to bed, and hopefully it's done when I wake up the next morning.
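Without seeing the macro this is only a guess, but the save normally comes from an explicit .Save/.SaveAs call (the step that gives the second workbook its new name) or from answering the prompt when it closes. If the results are already copied into the first workbook, closing the second one without saving skips all the disk writes; the path below is a placeholder:
Dim wb As Workbook
Set wb = Workbooks.Open("C:\analysis\book2.xls", ReadOnly:=True)   ' placeholder path
'... run the analysis and copy the results into ThisWorkbook ...
wb.Close SaveChanges:=False   ' discard the workbook instead of saving it under a new name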
The code works when the size is, say, 5,000 rows, but when I have more records in my sheet Excel just times out and ends with an error.
Note: I am using the statement below within the For loop:
Sheets("MergedDataSheetQ").Range(rplRange.Offset(0, 0), rplRange.Offset(sizeData, noColumns + 2)).AutoFilter Field:=noColumns + 1, Criteria1:=SubSector
Is there any workaround so that I only change the criteria inside the loop, or something similar, so that I can reduce the execution time of this statement?
I wrote a macro that works fine, although it runs extremely slowly as if plodding along through all of the cells one at a time. I'm sure the computer is faster than that, so I would like it to chug through more quickly.
I am using some syntax that could be optimized ....
I have written a macro which is sucessful in the sense that it does what I need. However, I ran it against my data for the first time today and it took forever. To give you an idea of the data size... my code had to loop through about 10,000 rows and move the ones that met the IF criteria to another sheet.
Can anyone provide any suggestions as to what might make my code faster? Should I take a different approach? I am definitely still wet behind the ears.
My code is below...
Sub MoveRows()   ' procedure name assumed; the posted code omitted the Sub line
    Dim count As Integer
    Dim PeCount As Integer
    count = 3
    PeCount = 3
    ActiveWorkbook.Worksheets("MyWorksheet").Activate
    Do
        If Trim(Sheets("MyWorksheet").Cells(count, 12).Value) = "1111111" Or _
           Trim(Sheets("MyWorksheet").Cells(count, 12).Value) = "2222222" Or _
           Trim(Sheets("MyWorksheet").Cells(count, 12).Value) = "3333333" Then
            ' move the matching row to the next slot on YourWorksheet
            Sheets("MyWorksheet").Rows(count).Select
            Selection.Cut
            Sheets("YourWorksheet").Activate
            Sheets("YourWorksheet").Rows(PeCount).Select
            Selection.Insert Shift:=xlDown
            ' remove the now-empty source row; the next row drops into position count
            Sheets("MyWorksheet").Activate
            Sheets("MyWorksheet").Rows(count).Select
            Selection.Delete Shift:=xlUp
            PeCount = PeCount + 1
        Else
            count = count + 1
        End If
    Loop Until IsEmpty(Sheets("MyWorksheet").Cells(count, 2).Value)
End Sub
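A hedged sketch of a faster version of the same job: no Select or Activate, one pass that collects the matching rows into a single range, then one copy and one delete. Column 12 and the three codes are carried over from the posted code; unlike the original, this appends below whatever is already on YourWorksheet rather than inserting at row 3.
Sub MoveMatchingRowsFast()
    Dim wsSrc As Worksheet, wsDst As Worksheet
    Dim rngMove As Range, r As Long, lastRow As Long
    Dim sVal As String

    Set wsSrc = Worksheets("MyWorksheet")
    Set wsDst = Worksheets("YourWorksheet")
    Application.ScreenUpdating = False

    lastRow = wsSrc.Cells(wsSrc.Rows.Count, 2).End(xlUp).Row
    For r = 3 To lastRow
        sVal = Trim(wsSrc.Cells(r, 12).Value)
        If sVal = "1111111" Or sVal = "2222222" Or sVal = "3333333" Then
            If rngMove Is Nothing Then
                Set rngMove = wsSrc.Rows(r)
            Else
                Set rngMove = Union(rngMove, wsSrc.Rows(r))
            End If
        End If
    Next r

    If Not rngMove Is Nothing Then
        rngMove.Copy wsDst.Cells(wsDst.Rows.Count, 1).End(xlUp).Offset(1, 0)   ' one copy
        rngMove.Delete Shift:=xlUp                                             ' one delete
    End If
    Application.ScreenUpdating = True
End Sub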
I have been lurking around for past month learning lots from MrExcel's wonderful web site. One of the many things I learned was how to improve my spreadsheets with Array Formulas, but today I ran into a problem on a new spreadsheet I'm building for work.
Here's the problem: I have 39 coworkers. For each coworker, I have 14 array formulas using the SUMPRODUCT function with up to 5 "conditions", similar to this example:
{=SUMPRODUCT((user=$A$5)*(task=AO$3)*(DateChecked>0)*((Error="Error Removed")+(Error="Error Converted to an FYI")))}
Each condition such as "user" and "task" is a static named range of 5000 cells. This spreadsheet will hold one week's worth of my coworkers' work. This past week they have processed about 2500 items. To be safe, I doubled this number to determine the static named range size.
For each worker I have 56 columns (one for each possible task which a coworker can process).
So for each coworker, there will be 14 * 56 = 784 Array Formulas.
Currently my spreadsheet only has a single coworker defined, so I only have 784 Array Formulas, but it takes 35 seconds at 100% CPU Utilization when I press F9 (Calculate all formulas). Right now, I am running this on my Home PC (a 400 MHz PII PC with 256 Megs of RAM, OS is Win2000 at SP4 maint level and Excel 2002), but it is equally slow at work (1.7 GHz Celeron with 256 MB of RAM running Win2K SP4 and Excel 2K).
I haven't tested yet, but even if I assume a linear progression, with 39 coworkers I am thinking it is possible the amount of time for Excel to recalculate all the formulas will be 39 times longer than it is currently. This will be close to 22 minutes. That is a long time to wait! It will be even worse if my testing shows the amount of time Excel takes to evaluate the array formulas is exponential instead of linear...
...784*39 = 30,576 formulas...
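A hedged suggestion rather than a definitive fix: SUMPRODUCT does not actually need to be entered with Ctrl+Shift+Enter, and its cost scales with the size of the named ranges, so replacing the fixed 5,000-cell ranges with dynamic ranges that only cover the rows in use cuts the work roughly in half at current volumes. Assuming (as a placeholder) that the raw user data sits in column A of a sheet called Data with a header in row 1, the named range "user" could be defined as:
=OFFSET(Data!$A$2,0,0,COUNTA(Data!$A:$A)-1,1)
The other named ranges would be defined the same way against their own columns so they all keep the same height.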
I have been assigned to speed up vlookup calculations that reference several workbooks. The spreadsheets currently take a very long time due to the enormous amount of calculations.
A typical example of the vlookup formulas is on the production tab of the attached excel file. We use 12 different workbooks that refer to each other and up to about 10 worksheets in each workbook.
I've read a little bit about optimizing VLOOKUP and using other methods, but I'm not sure how to apply it to my case, where it refers to other workbooks. Any help you could provide in giving me a place to start would be greatly appreciated (I'm not even sure if Excel formulas or VBA code would be the best approach).
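One place to start, offered as a sketch rather than a definitive answer: if each lookup table can be kept sorted on its key column, replacing one exact-match VLOOKUP with two approximate-match lookups is usually far faster on large ranges, because approximate match uses a binary search. With a placeholder range name LookupTable sorted on its first column, the pattern is:
=IF(VLOOKUP($A3,LookupTable,1,TRUE)=$A3,VLOOKUP($A3,LookupTable,3,TRUE),0)
The first lookup only confirms the key really exists, so a missing key returns 0 instead of a neighbouring row's value. The same pattern works whether LookupTable points into the same workbook or another one.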