I am currently using the following code to copy records from one sheet to new sheets that are created and named in the first part of the If(). This works fine, but when I am dealing with 50,000 records it still takes 5-10 minutes to sort them all. I think that, although functional, this is horribly inefficient, and I am wondering how I might speed up the process. Possibly, rather than checking each record and then copying and pasting individually, I could sort them first, then read down the list of cells until the value changes and copy each block over at once. I don't know whether that would be faster or not; let me know what you think.
Sub autorec()
    Dim wSheet
    Dim newSheetName As Variant
    Dim FNAC As Double
    Dim OU As Double
    Dim DS As Double
    Dim CCY As String
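For what it's worth, the sort-then-copy-in-blocks idea usually is much faster than moving records one at a time, because each block becomes a single copy operation. Below is a minimal sketch of that approach; the sheet name "Data", the key in column A and the header row are assumptions (the posted code is cut off), so it would need adapting to the real layout.

Sub CopyByGroup()
    ' Sketch only: sort once so each key occupies a contiguous block of rows,
    ' then copy each block to its own sheet in one operation.
    ' Sheet-name rules (length, invalid characters, duplicates) are not handled.
    Dim wsData As Worksheet, wsNew As Worksheet
    Dim lastRow As Long, blockStart As Long, r As Long
    Dim key As String

    Set wsData = ThisWorkbook.Worksheets("Data")       'placeholder sheet name
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual

    lastRow = wsData.Cells(wsData.Rows.Count, "A").End(xlUp).Row
    wsData.Range("A1").CurrentRegion.Sort Key1:=wsData.Range("A2"), _
        Order1:=xlAscending, Header:=xlYes

    blockStart = 2
    For r = 2 To lastRow + 1
        If r > lastRow Or wsData.Cells(r, "A").Value <> wsData.Cells(blockStart, "A").Value Then
            key = CStr(wsData.Cells(blockStart, "A").Value)
            Set wsNew = ThisWorkbook.Worksheets.Add(After:=ThisWorkbook.Sheets(ThisWorkbook.Sheets.Count))
            wsNew.Name = key
            'copy the whole block for this key in one shot
            wsData.Rows(blockStart & ":" & r - 1).Copy wsNew.Range("A1")
            blockStart = r
        End If
    Next r

    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub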
I need to speed up my existing macro, as the run takes too much time to complete (at least 20 minutes). I've already tried several loops, but none of them worked for me. The situation is as follows: there are two Excel files, and entries in columns 73-85 are copied from WorkbookRust to the other workbook if the number in column 5 is the same. Not every cell in these columns contains data, so the macro should automatically jump to the last entry in each of the columns mentioned above, instead of predefining the range as you see in the code below. After the data is copied to the other workbook, it is filtered according to Sub FilterMain and then copied back to WorkbookRust. As already said, the whole thing works, it's just too slow.
Sub Allmacros()
    Dim WorkbookRust As String
    WorkbookRust = ActiveWorkbook.Name
    ChDir "C:\Documents and Settings\vogt\My Documents\Rüstplausch"
    Workbooks.Open Filename:= _
        "C:\Documents and Settings\vogt\My Documents\Rüstplausch\CH_Revenue_2008.xls"
    Sheets("Main_Overview").Select
    Windows(WorkbookRust).Activate
    Application.Run ActiveWorkbook.Name & "!UpdateEntries"
    Application.Run ActiveWorkbook.Name & "!FilterMain"
    'do not ask to overwrite the existing file
    Application.DisplayAlerts = False
    Workbooks("CH_Revenue_2008.xls").Save
    Workbooks("CH_Revenue_2008.xls").Close
End Sub
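On the point about jumping to the last entry automatically: the usual pattern is to come up from the bottom of the sheet with End(xlUp) and use the row that returns instead of a fixed range. A small sketch; the sheet name "Data" is a placeholder, not taken from the workbook above.

Sub FindLastRowSketch()
    ' Sketch only: find the last used row of the key column (5) at run time so
    ' the copy range does not need to be predefined.
    Dim lastRow As Long
    With ActiveWorkbook.Worksheets("Data")
        lastRow = .Cells(.Rows.Count, 5).End(xlUp).Row
        'e.g. columns 73-85 down to the last entry, instead of a fixed block:
        Debug.Print .Range(.Cells(2, 73), .Cells(lastRow, 85)).Address
    End With
End Sub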
I have written a macro which is successful in the sense that it does what I need. However, I ran it against my data for the first time today and it took forever. To give you an idea of the data size: my code had to loop through about 10,000 rows and move the ones that met the If criteria to another sheet.
Can anyone provide any suggestions as to what might make my code faster? Should I take a different approach? I am definitely still wet behind the ears.
My code is below...
Dim count As Integer
count = 3
PeCount = 3
ActiveWorkbook.Worksheets("MyWorksheet").Activate
Do
If Trim(Sheets("MyWorksheet").Cells(count, 12).Value) = "1111111" Or _
   Trim(Sheets("MyWorksheet").Cells(count, 12).Value) = "2222222" Or _
   Trim(Sheets("MyWorksheet").Cells(count, 12).Value) = "3333333" Then
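One approach that tends to be much faster than testing each row in the Do loop is to let AutoFilter pick out the three codes and then copy every visible row in one operation. Below is a hedged sketch of that idea; the header row (2) and the destination sheet name "Matches" are assumptions, not taken from the post, and deleting the source rows afterwards could be done on the same visible range with .EntireRow.Delete.

Sub MoveMatchesSketch()
    ' Sketch: filter column 12 on the three codes, then copy all visible rows
    ' to the destination sheet in one shot.
    Dim wsSrc As Worksheet, wsDest As Worksheet
    Dim lastRow As Long

    Set wsSrc = ActiveWorkbook.Worksheets("MyWorksheet")
    Set wsDest = ActiveWorkbook.Worksheets("Matches")      'assumed destination

    Application.ScreenUpdating = False
    lastRow = wsSrc.Cells(wsSrc.Rows.Count, 12).End(xlUp).Row

    With wsSrc.Range(wsSrc.Cells(2, 1), wsSrc.Cells(lastRow, 12))
        .AutoFilter Field:=12, Criteria1:=Array("1111111", "2222222", "3333333"), _
            Operator:=xlFilterValues
        On Error Resume Next                 'no matches leaves nothing to copy
        .Offset(1, 0).Resize(.Rows.Count - 1).SpecialCells(xlCellTypeVisible).Copy _
            wsDest.Cells(wsDest.Rows.Count, 1).End(xlUp).Offset(1, 0)
        On Error GoTo 0
        .AutoFilter                          'remove the filter again
    End With
    Application.ScreenUpdating = True
End Sub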
I have a workbook with many lookups, SUMPRODUCTs, dynamic named ranges and CSE (array-entered) formulas. How do I determine what is causing my workbook to be slow? Are there more efficient formula types that I can use?
I am trying to determine in code whether a file exists in a certain location. This seems to be very slow, even in a folder which contains just one file. Is there any way to speed up this process? Three seconds seems like a long time, especially given that I will have to loop through this Sub many times. The files I am looking for are CSVs, so I can't use msoFileTypeExcelWorkbooks unless I can modify which file extensions it looks for. I only started dabbling with VBA a few weeks ago, so it is entirely possible I'm barking up the wrong tree and should be using another method to achieve my aims.
Sub CISORTEST()
    With Application.FileSearch
        .NewSearch
        .LookIn = "C:\TEMP"
        .SearchSubFolders = False
        .Filename = "MYCSVFILENAME"
        .MatchTextExactly = True
        .FileType = msoFileTypeAllFiles
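If the goal is only to test whether one known file exists, the built-in Dir function is usually near-instant and avoids Application.FileSearch entirely (FileSearch was also removed from later Excel versions). A small sketch; the ".csv" extension on the name is an assumption.

Sub FileExistsSketch()
    ' Checks for the CSV directly with Dir; an empty return means no such file.
    Dim fname As String
    fname = Dir("C:\TEMP\MYCSVFILENAME.csv")
    If Len(fname) > 0 Then
        MsgBox "Found " & fname
    Else
        MsgBox "File not found"
    End If
End Sub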
I have a large worksheet (Sheet1) containing approximately 15,000 records (15,000 rows x 21 columns). I need to search through all these records, and manually decide whether a record should be copied onto another worksheet (Sheet2) or not. The code I have written works very well (as far as I can tell), but it is extremely slow (the searching takes forever).
The code is as follows:
Sub SearchAndCopy()
    Dim SearchRow As Long
    Dim LastRow As Long
    Dim SearchColumn As Long
    Dim LastColumn As Long
    Dim CopyToRow As Long
    Dim SearchString As String
    Dim Found As Boolean
    Dim Response As Integer
    On Error GoTo Err_Execute
    Found = False
    'Fetch Search Term
    Sheets("Sheet2").Select
    SearchString = LCase(Range("F3").Text)......................
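Most of the time in a search like this goes to reading cells one at a time. Reading the whole 15,000 x 21 block into a Variant array, testing it in memory, and writing the matching rows to Sheet2 in a single block is a common way to cut that down. A hedged sketch follows; the searched column (6) and the output start cell (Sheet2!A6) are assumptions, since the posted code is truncated.

Sub SearchAndCopyFastSketch()
    ' Sketch: read Sheet1 into an array, collect matching rows, and write them
    ' to Sheet2 in one block.
    Dim data As Variant, out() As Variant
    Dim searchString As String
    Dim lastRow As Long, lastCol As Long
    Dim r As Long, c As Long, hits As Long

    searchString = LCase$(Worksheets("Sheet2").Range("F3").Text)

    With Worksheets("Sheet1")
        lastRow = .Cells(.Rows.Count, 1).End(xlUp).Row
        lastCol = 21
        data = .Range(.Cells(1, 1), .Cells(lastRow, lastCol)).Value
    End With

    ReDim out(1 To lastRow, 1 To lastCol)
    For r = 1 To lastRow
        If InStr(1, LCase$(CStr(data(r, 6))), searchString) > 0 Then
            hits = hits + 1
            For c = 1 To lastCol
                out(hits, c) = data(r, c)
            Next c
        End If
    Next r

    If hits > 0 Then
        Worksheets("Sheet2").Range("A6").Resize(hits, lastCol).Value = out
    End If
End Sub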
I have created code for printing specific charts on a "constant" page.
First of all, the page is the same every time. It contains the same graphs (although with different data each time) in the same placement.
So when I make the page with graphs, I also add a button with an attached macro.
This macro then shows a UserForm where the user can decide which graph he would like to print out. I want specific headers and footers on the page, so the only thing I could make work was copying the chart, placing it on a new chart sheet, setting the header and footer, and printing it out at the end. The code looks as follows:
Application.ScreenUpdating = False
On Error Resume Next
total = bruger + Varm + aar + varig   'the sub is called from another sub, so these are given values beforehand
If bruger = 1 Then
    'use PageSetup outside the chart object to create the header and footer
    Worksheets(sag).ChartObjects(1).Copy
    Worksheets(sag).Paste
End If
Now everything works just fine, except that it takes a really long time.
By testing I found out that copying a chart takes a lot of time in VBA code. Why is that? If I do it manually, I don't have to wait at all.
This is just a question of optimizing the code, because I can't understand why it takes so long with VBA.
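One thing that might remove the slow copy step altogether: an embedded chart's Chart object has its own PageSetup, so it may be possible to set the header and footer on it and print it directly instead of pasting it onto a new chart sheet. A sketch worth testing; "Sheet1" stands in for Worksheets(sag) and the header text is a placeholder.

Sub PrintChartDirectSketch()
    ' Sketch: print the first embedded chart with a custom header/footer without
    ' copying it to a separate chart sheet.
    Dim cht As Chart
    Set cht = Worksheets("Sheet1").ChartObjects(1).Chart
    With cht.PageSetup
        .CenterHeader = "My report header"     'placeholder text
        .CenterFooter = "Page &P"
    End With
    cht.PrintOut
End Sub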
The macro clears specific columns in a row when you click anywhere on the row and then hit the command button. It clears the first range and the second range in two distinct steps, and takes up to 3 seconds.
I am having a big problem with recalculating cells after running a macro. The macro runs at a great speed, but once calculation is switched back to automatic, the cells are not recalculating fast enough. I waited for 4 minutes and it still had not moved past 0%. The worksheet has approximately 9,000 rows out to column IG, and 85% of the cells contain an IF formula.
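One pattern that sometimes helps here is to stay on manual calculation and recalculate only the sheet (or even just the range) the macro changed, rather than handing the whole workbook back to automatic in one go; whether automatic mode even needs to be restored afterwards depends on how the workbook is used. A sketch of the pattern, with "Data" as a placeholder sheet name.

Sub RunWithControlledRecalcSketch()
    ' Sketch: keep calculation manual, do the macro's work, then recalculate only
    ' the affected sheet.
    Application.Calculation = xlCalculationManual
    Application.ScreenUpdating = False

    ' ... the existing macro's work would go here ...

    Worksheets("Data").Calculate               'recalculate just this sheet
    Application.ScreenUpdating = True
    'Application.Calculation = xlCalculationAutomatic   'restore only if needed
End Sub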
I have an analysis that is run by a macro in one workbook, and it starts an analysis process in another workbook. The data is picked up in the second workbook and returned to the first. It analyzes lots of workbooks, sometimes up to 1,000, which means the second workbook gets a new name and is then saved.
I have heard that the process could be much faster if the workbooks the analysis runs through were not saved, and I actually don't need those workbooks as long as I get the data into my first workbook.
But I'm not sure what in the macro makes it save the second workbook, and I would really like to speed up this process. As it is now, I have to start the analysis before I go to bed and hope it's done when I wake up the next morning.
I wrote a macro that works fine, although it runs extremely slowly as if plodding along through all of the cells one at a time. I'm sure the computer is faster than that, so I would like it to chug through more quickly.
I am using some syntax that could be optimized ....
I'm trying to determine the speed of a macro. I searched and have had no luck. Recently, with some help, I reduced my macro's run time from minutes to seconds, and I was wondering whether there is code out there that I can use to record the speed of an existing macro.
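A simple way to measure this from inside VBA is the Timer function (seconds since midnight) wrapped around the macro call; a sketch, where "MyExistingMacro" is a placeholder for the routine being timed.

Sub TimeMyMacroSketch()
    ' Records how long a macro takes by sampling Timer before and after the call.
    Dim t As Single
    t = Timer
    'Call MyExistingMacro          'placeholder for the macro being measured
    Debug.Print "Elapsed: " & Format$(Timer - t, "0.00") & " seconds"
End Sub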
I am dealing with data sets from various instruments that have different sample rates. I am deleting data points I don't need from some of the sets with higher sample rates so that all the data is on the same time scale.
The macro I have is super simple, but incredibly slow. I'm simply deleting every other cell down a column.
VB:
Sub OATcondense()
    Application.ScreenUpdating = False
    Do While ActiveCell <> ""
        ActiveCell.Offset(1, 0).Delete Shift:=xlUp
        ActiveCell.Offset(1, 0).Select
    Loop
    Application.ScreenUpdating = True
End Sub
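Deleting cells one at a time makes Excel shift the rest of the column up on every single pass, which is what kills the speed. Two common alternatives: build one Union of every other cell and delete it in a single operation (sketched below, assuming the data is in column A starting at row 1), or, for very long columns, read the values into an array and write back only the ones to keep.

Sub OATcondenseFastSketch()
    ' Sketch: builds a single Union of every other cell in column A and deletes
    ' it in one operation. Column A and the start row are assumptions.
    Dim lastRow As Long, r As Long
    Dim toDelete As Range

    Application.ScreenUpdating = False
    lastRow = Cells(Rows.Count, "A").End(xlUp).Row

    For r = 2 To lastRow Step 2            'every other cell, starting at row 2
        If toDelete Is Nothing Then
            Set toDelete = Cells(r, "A")
        Else
            Set toDelete = Union(toDelete, Cells(r, "A"))
        End If
    Next r

    If Not toDelete Is Nothing Then toDelete.Delete Shift:=xlUp
    Application.ScreenUpdating = True
End Sub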
I've got a rather involved macro that's running kind of slowly, and I would appreciate any help speeding it up. It's in two parts: the first creates and emails a report, and the second formats it so it's pretty for printing. The full code for both routines is pasted below.
The email part I developed first and it runs pretty quickly. Afterwards, I added the second macro, which is called halfway through the first.
Stepping through the code in the second macro, the problem I see is in this section, the setup for setting the height of merged cells in the report:
I run a simple macro loop to clean some data across nine columns. The purpose is to collapse the data in the columns so that column 1 has the first value found in that row, for the set of columns. For instance, if columns 1-4 are empty, it deletes / shifts everything left until the first column is not empty. Then it goes to the next row and repeats. Data can range from a few rows up to 6000.
[I have a period in the data as the cell content to evaluate]
Sub A_Rollup_collapse()
    StartT = Now
    Dim Col As Integer
    Col = Range("IV1").End(xlToLeft).Column - 9
    LastR = Range("A60000").End(xlUp).Row
    Application.ScreenUpdating = False
    For R = 2 To LastR
        Do While Cells(R, Col) = "."
            Cells(R, Col).Delete Shift:=xlShiftToLeft
        Loop
    Next R
    EndT = Now
    Application.ScreenUpdating = True
This macro (B) runs after another macro (A) that populates the nine columns with data using VLOOKUPs. Macro (A) builds out a chain of information from column 2 to column 9, converts it to values, etc. Nothing odd.
When macro (B) is called right after running macro (A), it can take about a minute for 500 rows of data.
When I save and close the workbook, reopen it, and run macro (B), it only takes one second.
When I insert a ThisWorkbook.Save between the two call statements, macro (B) still takes over a minute.
A minute is not too bad, but when I'm dealing with thousands of rows the difference is more like 30 seconds vs. 9 minutes, which is a problem.
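For the collapse itself, working on the block in memory avoids the repeated single-cell deletes (each of which can trigger work on the sheet the VLOOKUPs just populated): read the nine columns into an array, shift the non-"." entries left within each row, and write the block back once. A hedged sketch, assuming the block really is nine columns wide starting at the same column the code above computes.

Sub CollapseInMemorySketch()
    ' Sketch: collapse the "." placeholders in memory instead of deleting cells
    ' one at a time. Assumes the block is 9 columns wide.
    Dim firstCol As Long, lastR As Long
    Dim data As Variant
    Dim r As Long, c As Long, w As Long

    firstCol = Range("IV1").End(xlToLeft).Column - 9
    lastR = Range("A60000").End(xlUp).Row

    data = Range(Cells(2, firstCol), Cells(lastR, firstCol + 8)).Value

    For r = 1 To UBound(data, 1)
        w = 1
        For c = 1 To 9
            If data(r, c) <> "." Then
                data(r, w) = data(r, c)
                w = w + 1
            End If
        Next c
        Do While w <= 9                    'blank out the tail of the row
            data(r, w) = ""
            w = w + 1
        Loop
    Next r

    Range(Cells(2, firstCol), Cells(lastR, firstCol + 8)).Value = data
End Sub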
It takes all numbers in column 4, starting with row 15, and deletes all duplicates. It then counts how many times each number was duplicated and puts this count in column 3. MY PROBLEM: the macro searches each line and takes FOREVER! I have data with thousands of lines. I already tried the ScreenUpdating method, which really doesn't help that much. Is there possibly a better way of doing this?
Sub Factor()
    Dim sID As String
    Dim sOldID As String
    Dim lLastRow As Long
    Dim lrow As Long
    Dim lcount As Long
    Dim lLoop As Long
    lLastRow = ActiveSheet.Cells(Rows.Count, 1).End(xlUp).Row
    lrow = 15
    sID = ActiveSheet.Cells(lrow, 4).Value
    sOldID = "ActiveSheet.Cells(4, 15).Value"
    lcount = 1
    lLoop = 1
    Do While Len(sID) <> 0
        If sID <> sOldID Then
            If lLoop = 1 Then.................................
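One common alternative is a single pass with a Scripting.Dictionary: count every value once, then write the unique values and their counts back in one block. A sketch (late bound, so no reference is needed); it keeps the layout described above, value in column 4 and count in column 3, but the exact output placement is an assumption.

Sub FactorFastSketch()
    ' Sketch: count duplicates in column 4 (from row 15) with a Dictionary and
    ' write unique values plus counts back in one block.
    Dim dict As Object
    Dim lastRow As Long, r As Long, i As Long
    Dim key As Variant
    Dim out() As Variant

    Set dict = CreateObject("Scripting.Dictionary")
    lastRow = ActiveSheet.Cells(ActiveSheet.Rows.Count, 4).End(xlUp).Row

    For r = 15 To lastRow
        key = ActiveSheet.Cells(r, 4).Value
        If Len(CStr(key)) > 0 Then dict(key) = dict(key) + 1
    Next r
    If dict.Count = 0 Then Exit Sub

    'clear the old list, then write uniques and their counts in one shot
    ActiveSheet.Range(ActiveSheet.Cells(15, 3), ActiveSheet.Cells(lastRow, 4)).ClearContents
    ReDim out(1 To dict.Count, 1 To 2)
    i = 0
    For Each key In dict.Keys
        i = i + 1
        out(i, 1) = dict(key)     'count in column 3
        out(i, 2) = key           'value in column 4
    Next key
    ActiveSheet.Cells(15, 3).Resize(dict.Count, 2).Value = out
End Sub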
Below is the entire code that I am using. It is a simple routine which checks whether a part has started its release process or not, based upon dates. The code works and does what I want.
The problem that I have is that it is very slow; for example, it takes 35 seconds to go through 530 line items. In my (limited) experience, based on other VBA doodlings, this is slow.
I have the following macro in a worksheet, and it is running very slowly. There are other macros in the worksheet and they all run very well. Any ideas, from looking at this code, why it would be so slow?
The code below runs on a spreadsheet that has approximately 600 rows, which it INDEXes and MATCHes against another spreadsheet that also has approximately 600 rows. It takes about 4 minutes to run.
Sub QC()
I need some alternative code that would speed up the execution of this macro. My weak attempt runs noticeably SLOW.
Dim cell As Range
For Each cell In Range("TPD")
    If IsNumeric(cell) And Not IsEmpty(cell) Then
        If cell.Value > 0 Then
            cell.EntireColumn.Hidden = False
        Else
            cell.EntireColumn.Hidden = True
        End If
    End If
    On Error GoTo 0
Next
End Sub
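Every .Hidden assignment inside the loop forces Excel to re-lay-out the sheet, so the cost is paid once per cell. Collecting the columns to hide and to show into two ranges and setting Hidden once per group, with ScreenUpdating off, is usually much quicker. A sketch over the same "TPD" range; note that if a column appears in both groups, this version hides it, whereas the original let the last cell win.

Sub HideColumnsFastSketch()
    ' Sketch: decide visibility for each column in "TPD" in memory, then hide or
    ' unhide each group with a single assignment.
    Dim cell As Range
    Dim toHide As Range, toShow As Range

    Application.ScreenUpdating = False
    For Each cell In Range("TPD")
        If IsNumeric(cell) And Not IsEmpty(cell) Then
            If cell.Value > 0 Then
                If toShow Is Nothing Then Set toShow = cell Else Set toShow = Union(toShow, cell)
            Else
                If toHide Is Nothing Then Set toHide = cell Else Set toHide = Union(toHide, cell)
            End If
        End If
    Next cell

    If Not toShow Is Nothing Then toShow.EntireColumn.Hidden = False
    If Not toHide Is Nothing Then toHide.EntireColumn.Hidden = True
    Application.ScreenUpdating = True
End Sub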
I have recorded 7 different macros and then combined them all into one macro to achieve one end result. I am not sure if you can just look at the code to determine different ways to improve it, or if you need the Excel spreadsheet as well.
Attached is a workbook that takes an imported text file and inserts rows with text.
It is SLOW; you can watch each line being inserted with the text. I would think that with such a small sample dataset it would be much, much quicker.
Information: See attached xls file.
On Sheet1 is the imported data; Sheet2 is a copy of the imported data so one can copy and paste back to Sheet1 as needed without re-importing, for test purposes and only for this query. (Sheets 2 through 3 are not used otherwise.)
There is a command button on Sheet1 which will run macro "aaa". If you run it, you will see how slow it is and exactly what it is doing.
I am simply looking for a way to speed this up; I have some files that are 10 times the size of the sample data, and they take 10 or more minutes to run.
I'm setting up a spreadsheet that does engineering calculations. I'm using macros to run sizes from a standard schedule. It basically takes the values from one sheet (schedule) to another (calculation), then the result from the calculation sheet (the value only, not the link) is pasted back into the schedule. The macro seems very bulky, and I'm sure it can be made more efficient with a loop. Here is a sample of the code from the macro: