where the first column has dates, the next column is the book name and the last column is the profit and loss for each book, i.e. the values that need to be added.
Now, suddenly I do not get any numbers, just #VALUE!, but if I change the range to a lot less, e.g. between rows B6 and B50, it works.
My question is whether this has to do with memory in Excel. Are there limits to how many formulas and how many names you can have before Excel crashes? I am aware that the 28 column names cover very large ranges.
Is there any way to get around this: VBA code, other formulas that use less power, not naming the columns, or something else?
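One workaround, sketched below in VBA, is to skip the huge named columns and sum the matching rows directly in code. The sheet name "Data", the column layout (A = date, B = book name, C = profit/loss) and the cells E1/F1 are my assumptions for illustration, not from the original post.

    Sub SumBookPL()
        ' Sum column C (profit/loss) for every row whose book name in
        ' column B matches the name typed into E1; write the total to F1.
        ' Sheet/column/cell addresses are assumed; adjust to taste.
        Dim ws As Worksheet, lastRow As Long, r As Long
        Dim book As String, total As Double
        Set ws = Worksheets("Data")
        book = ws.Range("E1").Value
        lastRow = ws.Cells(ws.Rows.Count, "B").End(xlUp).Row
        For r = 2 To lastRow
            If ws.Cells(r, "B").Value = book Then
                total = total + ws.Cells(r, "C").Value
            End If
        Next r
        ws.Range("F1").Value = total
    End Sub

Looping only over the rows that actually hold data keeps the work proportional to the data instead of to whole-column names.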
I have an Excel sheet that has an on-close macro, and on certain computers I get an error message stating "out of memory". Can anyone explain why this is and how I could avoid it, by possibly clearing the memory first or something?
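A minimal sketch of the kind of cleanup sometimes tried in the close event, assuming the problem comes from clipboard contents or lingering object references (the actual cause on those machines could well be something else):

    Private Sub Workbook_BeforeClose(Cancel As Boolean)
        Application.CutCopyMode = False   ' release anything held on the clipboard
        ' If the on-close macro keeps module-level object variables
        ' (workbooks, recordsets, forms), set each one to Nothing here.
    End Sub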
I have a spreadsheet in Excel 2000 which is 256 columns wide and 4,000 rows long. The only formulas are COUNTIFs, which run from rows 1000 to 4000 and are copied across to column 256. Just for the one sheet it is 14MB. Is this normal?
I am currently using Excel 2003 to pull from 2 external spreadsheets to compile data. The first sheet (seniority list.xls) being referenced contains all of our employees (approximately 350 rows, but that can vary from 340 to 450) as well as some information (employee ID, full time or part time, etc.). The second sheet being referenced (pay.xls) contains a breakdown of all employees and their pay for a particular bi-weekly period.
What I'm looking to do is:
(1) Find out all of my current employees. I do this by referencing seniority list.xls.
(2) For each employee, find out how many hours they worked each week. I need to do this by referencing pay.xls. Unfortunately, that file works quite differently than seniority list.xls. Unlike the latter, which contains one employee number per row, pay.xls could contain as many or as few rows per employee as their schedule dictates. Each row contains things such as overtime, regular time, double time, alternate regular time, alternate OT, vacation, sick pay, etc. In total, pay.xls could contain as many as approximately 8000 rows. To determine how many hours they work, I'm using this CSE (array) formula:
(Note that the employee ID is in column A of the current sheet.)
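The formula itself did not come through, but a CSE formula for this kind of per-employee total typically looks something like the line below; the sheet name, the hours column (D) and the 8,000-row range are guesses for illustration, not taken from the post. It must be confirmed with Ctrl+Shift+Enter rather than Enter:

    =SUM(IF('[pay.xls]Sheet1'!$A$2:$A$8000=A2,'[pay.xls]Sheet1'!$D$2:$D$8000,0))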
I have a large amount of data on NBA teams and players that is extracted from external web pages. In order to show the data on separate sheets for comparisons, I have to use many array formulas, and from what I have read that takes up a large amount of memory. I'm now at the point where I'm receiving a message that says "Excel cannot complete this task with available resources" when I open the workbook.
Is there any plausible way that I can resolve this situation? I have thought about converting the lookup formulas from arrays (INDEX/MATCH) to VLOOKUPs and HLOOKUPs.
I'm not as advanced with using Excel as most of you, and I realize I may be at a point where I will need to use different software to analyze my data, but I would prefer to stick with Excel, as it would be difficult to start from scratch and learn how to use a different program.
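For reference, the conversion being considered swaps a pattern like the first line below for the second. The sheet name, ranges and column number are placeholders; the two lines return the same result, and VLOOKUP only works when the lookup column sits to the left of the return column:

    =INDEX(Stats!$C$2:$C$500,MATCH(A2,Stats!$A$2:$A$500,0))
    =VLOOKUP(A2,Stats!$A$2:$C$500,3,FALSE)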
Regarding the Memory Leak problem with ADO, does anyone know of a workaround?
The workaround mentioned on Microsoft's website (http://support.microsoft.com/default...;en-us;Q319998) pertains to using ADO to retrieve a recordset from an Excel worksheet, not from an Access Database.
1) Does anyone know if those workarounds would also address retrieving a recordset from an Access Database?
2) If so, is the "SELECT INTO" workaround effective?
Is this Memory Leak issue relatively minor, or does it only affect Excel 2000? I'm guessing it must be one of those two things, since the consensus is that ADO is superior to MS Query.
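I can't speak to that KB article's specific fix, but the general mitigation used with ADO from VBA, sketched below against a hypothetical Access path and table, is to close and release every ADO object as soon as the data has landed:

    Sub PullFromAccess()
        Dim cn As Object, rs As Object
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
                "Data Source=C:\Data\mydb.mdb"          ' hypothetical path
        Set rs = cn.Execute("SELECT * FROM MyTable")    ' hypothetical table
        Worksheets(1).Range("A1").CopyFromRecordset rs
        rs.Close: Set rs = Nothing   ' close and release, recordset first
        cn.Close: Set cn = Nothing
    End Sub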
I'm using a Dell P4, 1.6GHz, 512MB of memory (Win2K & Excel 2003), and it's bogging down on spreadsheets that aren't that big and complicated (compared to things I did years ago, in the days of 386s and Win98)...
It hits the 100% CPU usage and stays there for minutes (especially when saving) on files that are only around 2M in size, yet it's not using much of the memory...
Spreadsheets large or small seem to have little effect on how much memory is being used... is Win2K allocating a fixed % of total memory for Excel to use?... can this be changed, so Excel gets to use more of the available memory?
I've been cleaning up and redesigning my spreadsheets to make them more efficient, but there seems to be a disconnect between my hardware specs and Excel's performance with files that aren't that big ~:-/
Disabling my antivirus software (Norton) prior to saving seems to speed it up a little, but this is not something I should have to do...
One of the apps I run is very large and memory intensive, and there are several subsequent procedures which tend to either bog down or error out before the program finishes running.
Is there a way to clear the memory out?
And adding more physical memory is not an option; I have 4GB already, and Excel can only utilize 1GB anyway.
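There is no single "clear memory" call in VBA, but a sketch of the usual housekeeping between heavy procedures is below; whether it helps depends on where the memory is actually going. The array and workbook variables are stand-ins for whatever the app really holds:

    Dim arrData() As Double   ' example module-level array
    Dim wbSource As Workbook  ' example module-level object reference

    Sub ReleaseWorkingMemory()
        Application.CutCopyMode = False   ' empty the copy/paste buffer
        Erase arrData                     ' free the array's storage
        Set wbSource = Nothing            ' drop the object reference
    End Sub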
I have tried Tools/Options/Security "Password to modify", but the file can still be saved as a copy to the network, which becomes very confusing to many. If I use Tools/Protect Sheet, I need to re-enter the password for all sheets, twice each. I am trying to have it so that everyone is read-only, select people can edit (with the same password), and all sheets and the workbook are password-protected with the same password every time the workbook is closed.
Basically, 2 questions:
1) Is there a way of remembering a password so that when the workbook is closed, it and all sheets are automatically protected with the same password every time?
2) Is it possible to have it not copied unless it is opened by an editor?
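On question 1, a common approach, sketched below with a placeholder password, is to protect every sheet plus the workbook structure from the workbook's BeforeClose event so nobody has to retype anything:

    Private Sub Workbook_BeforeClose(Cancel As Boolean)
        Const PWD As String = "changeme"   ' placeholder password
        Dim ws As Worksheet
        For Each ws In ThisWorkbook.Worksheets
            ws.Protect Password:=PWD       ' same password on every sheet
        Next ws
        ThisWorkbook.Protect Password:=PWD, Structure:=True
        ThisWorkbook.Save
    End Sub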
I have created an Excel tool with a LOT of macros in it, and the tool "speaks" to the SQL server (downloads data from it and uploads it back). My big problem now is that every time I run any of the VBA code, the amount of RAM allocated to Excel (2007) increases - I am checking it in the Task Manager. So if I work for a long time without closing down Excel, my entire RAM gets consumed and the system slows down, and sometimes crashes.
I was wondering if there is a way in VBA to clean up memory that is used by Excel.
Do you have any hints? I am extensively using three things: screen updating off, display messages off, and calculation off. This speeds up Excel, but it doesn't reduce the memory usage.
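Assuming the SQL traffic goes through ADO (the post doesn't say), the growth usually comes from connections and recordsets that are never released. One habit that helps is a cleanup routine at the end of every macro run:

    Sub CleanUpAfterRun()
        ' Restore application settings and drop anything the run created.
        Application.ScreenUpdating = True
        Application.Calculation = xlCalculationAutomatic
        Application.CutCopyMode = False
        ' Close and release any ADO objects opened during the run, e.g.:
        ' rs.Close: Set rs = Nothing
        ' cn.Close: Set cn = Nothing
    End Sub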
I have a macro which filters data on one sheet and copies that data onto 34 new sheets. This works perfectly until the 25th pass, when Excel advises that there is insufficient memory to continue. I assumed that this was connected to the Office clipboard and its ability to remember 24 chunks of data, so I turned it off, but still Excel gave up at the 25th pass. Is there a way that I can clear the clipboard during the macro's execution, or is there something else going on?
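One way to sidestep the clipboard entirely is to copy with a Destination argument, which transfers the cells directly instead of staging them on the clipboard; a minimal sketch with invented sheet names and ranges:

    Sub CopyWithoutClipboard()
        ' Copy straight to the target sheet; Destination bypasses the clipboard.
        Worksheets("Source").Range("A1:J100").Copy _
            Destination:=Worksheets("Target").Range("A1")
        Application.CutCopyMode = False   ' clear any residue from earlier copies
    End Sub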
1. The following is the VBA version of what I've been using:

    Range("A1:J1000").Sort Key1:=Range("D2"), Order1:=xlAscending, _
        Key2:=Range("E2"), Order2:=xlAscending, _
        Key3:=Range("F2"), Order3:=xlDescending, _
        Header:=xlGuess, OrderCustom:=1, MatchCase:=False, _
        Orientation:=xlTopToBottom, DataOption1:=xlSortNormal, _
        DataOption2:=xlSortNormal, DataOption3:=xlSortNormal

Now the arrays are bigger than 1,000 rows, even bigger than 65,536.
Such an array is stored in "arrBIG" and Dim'ed as (1 To 10, 1 To 100000). The goal is to sort it by the 4th, then 5th, then 6th values in the first dimension (the one that is 1 To 10); I call this the column.
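Since a worksheet in that version tops out at 65,536 rows, the sort has to happen in memory. A sketch that sorts an index array over the columns of arrBIG (4th value ascending, then 5th ascending, then 6th descending) instead of repeatedly swapping whole columns:

    Sub SortArrBIGDemo()
        Dim arrBIG() As Variant
        ReDim arrBIG(1 To 10, 1 To 100000)
        ' ... fill arrBIG here ...
        Dim idx() As Long, j As Long
        ReDim idx(1 To UBound(arrBIG, 2))
        For j = 1 To UBound(idx): idx(j) = j: Next j
        QuickSortIdx arrBIG, idx, 1, UBound(idx)
        ' idx(1), idx(2), ... now list the columns of arrBIG in sorted order.
    End Sub

    Private Function CompareCols(arr() As Variant, i As Long, j As Long) As Long
        ' Order by row 4 ascending, then row 5 ascending, then row 6 descending.
        If arr(4, i) <> arr(4, j) Then
            CompareCols = IIf(arr(4, i) < arr(4, j), -1, 1)
        ElseIf arr(5, i) <> arr(5, j) Then
            CompareCols = IIf(arr(5, i) < arr(5, j), -1, 1)
        ElseIf arr(6, i) <> arr(6, j) Then
            CompareCols = IIf(arr(6, i) > arr(6, j), -1, 1)   ' descending key
        Else
            CompareCols = 0
        End If
    End Function

    Private Sub QuickSortIdx(arr() As Variant, idx() As Long, lo As Long, hi As Long)
        Dim i As Long, j As Long, p As Long, t As Long
        i = lo: j = hi
        p = idx((lo + hi) \ 2)   ' pivot column; arr itself never moves
        Do While i <= j
            Do While CompareCols(arr, idx(i), p) < 0: i = i + 1: Loop
            Do While CompareCols(arr, idx(j), p) > 0: j = j - 1: Loop
            If i <= j Then
                t = idx(i): idx(i) = idx(j): idx(j) = t
                i = i + 1: j = j - 1
            End If
        Loop
        If lo < j Then QuickSortIdx arr, idx, lo, j
        If i < hi Then QuickSortIdx arr, idx, i, hi
    End Sub

Sorting the index array rather than the data itself means each swap moves one Long instead of ten Variants, which matters at 100,000 columns.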
I'm trying to import data into my sheet with formulae like this in row 1 (columns A:I): ='F:\Dir\[File.xls]Sheet'!A2.
The files I'm importing the data from all have 50,000 rows and 9 columns, so I copy the formulae downward 50,000 times. This generates an error ("Not enough memory"), not surprisingly. I've also tried to do it in steps by way of VBA: first two thousand, copy and paste values, next two thousand, etc. But that doesn't work either.
Does anyone know of a solution to this memory problem, or a way to work around it (without opening the files with the data in them)?
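One way to pull the cells without opening the source files is ADO, which can read a closed workbook directly; a sketch assuming the path from the formula above and a sheet literally named "Sheet":

    Sub ImportClosedWorkbook()
        Dim cn As Object, rs As Object
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
                "Data Source=F:\Dir\File.xls;" & _
                "Extended Properties=""Excel 8.0;HDR=No"""
        Set rs = cn.Execute("SELECT * FROM [Sheet$A2:I50001]")
        Worksheets(1).Range("A1").CopyFromRecordset rs   ' values only, no formulae
        rs.Close: Set rs = Nothing
        cn.Close: Set cn = Nothing
    End Sub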
UserForm1 has a ComboBox1 (list of customers) and a CommandButton1; UserForm2 has many textboxes.
When I click on UserForm1.CommandButton1, UserForm2 shows with TextBox1.Text being what was in UserForm1.ComboBox1.
When I am done with UserForm2 and click close to go back to UserForm1, choose a different customer and click the CommandButton1, UserForm2 opens with the previous value.
How can I clear the previous value?
Currently, when I close UserForm2 with its CommandButton1, I use the following code:
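The close code itself did not come through, but if it only hides the form, the old value survives because the form instance stays loaded. One fix is to unload it instead, so UserForm_Initialize runs again and the textboxes are rebuilt fresh on the next show:

    Private Sub CommandButton1_Click()
        Unload Me   ' destroy UserForm2 so its textboxes reset next time
    End Sub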
I have this memory game I want to set up on the computer. In table A, row 2 (B2:E2), I (Moti) start off as the first player and place a series of different items (in the example letters are given, but they could be kitchen articles, food and so on, separated by commas). These items are shown to player Jack for a few seconds, so he can memorize them and answer in (B3:E3) ....
I have recently encountered a problem where, whenever I try to run or edit any part of my userform code, I get the following error:
Compile Error:
Out of memory
The help document then lists about 12 things that could cause this.
My userform code is very small; however, the form itself is large. It has a MultiPage with 12 tabs, and each tab contains between 50 and 100 labels, which populate with data from the spreadsheet in the background.
Do you think this size could be causing the error? If not, does anyone have any solutions / ideas?
I got a CD with pricing and (high resolution) pictures (for printing). I've built the search/display macros for displaying a page on the net, so all I want to do is reduce the pics (Adobe Photoshop) to save memory - easy, right? Why can't I get the pics out of Excel to work on them in Photoshop? I've done the Tools>Options>General tab>"Ignore other applications" (clear) thing. I've done the Shift>Edit>Copy Picture and right-click>Copy things - but I can't paste anything anywhere. What's up? I didn't think I was this bad, but I guess I am; life is humbling.
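One route that often works when pasting into other applications fails is exporting each picture to a file from VBA by bouncing it through a temporary chart; a sketch (output lands next to the workbook, and the msoPicture filter is an assumption about how the images are embedded):

    Sub ExportPictures()
        Dim shp As Shape, co As ChartObject, ws As Worksheet
        Set ws = ActiveSheet
        For Each shp In ws.Shapes
            If shp.Type = msoPicture Then
                shp.CopyPicture Appearance:=xlScreen, Format:=xlBitmap
                Set co = ws.ChartObjects.Add(0, 0, shp.Width, shp.Height)
                co.Chart.Paste
                co.Chart.Export ThisWorkbook.Path & "\" & shp.Name & ".png"
                co.Delete   ' remove the temporary chart
            End If
        Next shp
    End Sub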
I have Windows XP and xl2k. I have a single-sheet workbook (9MB) linked to another multi-sheet workbook of databases (20MB). The single-sheet workbook has about 100 pictures (30k each) and, when a number is entered, it searches the database. If no match is found for a given picture and product info, then the picture is deleted and the rows are hidden (macros). Usually, about 10 products are found in the database for any one entry.
My goal is to "save as > web page" and email these results off to my customer. The problem is that the after-the-search result page is about 1MB, and so too is the .htm file that is created when I "save as > web page". I was thinking that 10 pictures at 30k should be around 300k plus the formatting (no?). I cleared the "tools>options>general>web options>general>save any additional hidden data..." option to little effect. Is there a macro that will delete the hidden rows, which I should run AFTER the product search (because deleting them instead of hiding them messes up the macro run)? I'm sure that would help (right?). But my original thought was that the .htm file size would not be affected by hidden rows.
The per-item section of Excel is 375 cells plus the picture, with simple font and formatting and no other objects, etc. When I delete the picture and "save as > web page" just this one-item section area (375 cells), the .html file is 37kb in size. That seems large to me (yes/no?). (10 x 37k = 370k before pictures!) This may be a reasonable file size; I'm no expert. I read a post saying xl2003 has a compression application - is that what I need, or is there a 3rd-party app that can help me reduce the size of this .htm file? Or is this the best I can expect from this method, and should I be exploring elsewhere (metatags?)?
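As for a macro to delete the hidden rows after the search (run it on a copy, since deleting is destructive and, as noted, breaks the search macro if done too early), a minimal sketch:

    Sub DeleteHiddenRows()
        Dim r As Long, lastRow As Long
        With ActiveSheet
            lastRow = .UsedRange.Row + .UsedRange.Rows.Count - 1
            ' Walk bottom-up so row numbers don't shift under the loop.
            For r = lastRow To 1 Step -1
                If .Rows(r).Hidden Then .Rows(r).Delete
            Next r
        End With
    End Sub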
I was reading in the sticky post about the new version of Excel that is due out this year. One of the improvements listed was the increase of usable memory in Excel from 1GB to the amount of Windows memory. In reading this, am I to understand that increasing a machine's physical memory above 1GB is useless for Excel in the current version? My specific issue is that I have a very involved worksheet that takes about 2.5 minutes to calculate, and this will increase as more data is added; it is heavily formula-laden. I was going to increase memory from 2GB to 4GB to help reduce the calculation time.
I have a workbook with code that opens many workbooks and performs several updates in each one. It seems that, for some reason, there is a build-up of memory with big files etc. Is there a way, in code, that after each workbook has been updated and before the master workbook loads the next workbook for update, I can clear all memory, similar to closing Excel down completely and reopening?
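There is no supported "flush everything" call short of restarting Excel, but closing each workbook and dropping its reference before opening the next usually keeps the build-up in check; a sketch with placeholder file paths:

    Sub UpdateAllWorkbooks()
        Dim files As Variant, i As Long, wb As Workbook
        files = Array("C:\Data\Book1.xls", "C:\Data\Book2.xls")   ' placeholders
        For i = LBound(files) To UBound(files)
            Set wb = Workbooks.Open(files(i))
            ' ... perform the updates on wb here ...
            wb.Close SaveChanges:=True
            Set wb = Nothing                  ' release before the next file
            Application.CutCopyMode = False   ' drop any copied data
        Next i
    End Sub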
I am working on a spreadsheet that is very large. Opening the spreadsheet, and even operating it unless calculation is set to manual, can slow down the computer quite a bit. The spreadsheet has quite a few sheets, a lot of data, very basic formatting, and a lot of VLOOKUP (as well as other) formulas.
It crashes after one or two macro runs, and you are forced to restart Excel. With a 7MB file, this can cause some frustration due to the time it takes to load the file.
The tool is used to make graphs, and my code works with an umbrella macro that "loops" through a main macro depending on the number of graphs the user wants. From the error messages, I am guessing that the causes are either my userforms or my variables (I try to release them, but I'm not sure if I am doing it right), but either way, I just want it to stop crashing.
Here is the code (I didn't include all of the side macros due to sheer length. I can if need be...):
Public comp As String
Public mega As String
Public custom As String
Public total As String
Public floored As String
Public PPName As String
Public SScreen As Boolean
Public multiG As Boolean
Public multipp As Boolean
Public one As Boolean
Public two As Boolean
Public three As Boolean
Public four As Boolean
Public five As Boolean
Public six As Boolean
Public seven As Boolean
Public eight As Boolean
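The strings and booleans above cost almost nothing, so if variables are involved at all, the userforms are the likelier leak; unloading them (not just hiding them) matters most. A sketch of an explicit reset routine, called at the end of the umbrella macro, that covers both:

    Sub ResetState()
        ' Return the public state to its defaults after each full run.
        comp = vbNullString: mega = vbNullString: custom = vbNullString
        total = vbNullString: floored = vbNullString: PPName = vbNullString
        SScreen = False: multiG = False: multipp = False
        one = False: two = False: three = False: four = False
        five = False: six = False: seven = False: eight = False
        ' Unload every form still loaded, not merely hidden:
        Do While VBA.UserForms.Count > 0
            Unload VBA.UserForms(0)
        Loop
    End Sub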
I am at a point where I am running out of RAM with the 2GB allowed in 2007. Does anyone know if Microsoft has a later version of Excel that will allow me to grab 3GB? My file sizes are close to 0.5GB. It seems that I am always pushing the limit of what Microsoft can provide.
I am working on an Excel sheet of 156KB. Unfortunately, it has increased to around 5MB. May I know the reason why that may have happened, and how can I get the file back to its original size?
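A frequent cause of this kind of bloat is a used range that has grown far beyond the real data (e.g. from formatting applied to whole columns). A sketch of the usual cure, which deletes the rows below the last real entry on each sheet and then saves; run it on a copy first:

    Sub TrimUsedRange()
        Dim ws As Worksheet, lastCell As Range
        For Each ws In ThisWorkbook.Worksheets
            ' Find the last cell that actually holds anything.
            Set lastCell = ws.Cells.Find(What:="*", SearchOrder:=xlByRows, _
                                         SearchDirection:=xlPrevious)
            If Not lastCell Is Nothing Then
                If lastCell.Row < ws.Rows.Count Then
                    ws.Rows(lastCell.Row + 1 & ":" & ws.Rows.Count).Delete
                End If
            End If
        Next ws
        ThisWorkbook.Save   ' the file shrinks on the next save
    End Sub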
I have written this macro, which I run about 1,000 times in a loop. It runs Solver and copies the results to a row in my spreadsheet, so I end up with a sheet of Solver results.
The problem is that every time it runs it eats up memory and slows down. I timed it: the first 100 runs take about 2 minutes and the last 100 take about 15 minutes, and the RAM use increases by about 4GB in that time too.
As this is the first VBA code I have written, and I have been learning as I go along, I assume it's something I have done wrong in the code.
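Without the code it's guesswork, but the usual suspects in a Solver loop are things created once per pass and never let go: clipboard copies, Select/Activate chains, and result rows written cell by cell. A sketch of the loop hygiene that tends to stop the slowdown (it requires a reference to the Solver add-in; sheet and range names are placeholders):

    Sub RunSolverBatch()
        Dim i As Long
        Application.ScreenUpdating = False
        For i = 1 To 1000
            SolverSolve UserFinish:=True      ' solve without showing the dialog
            ' Copy results by value; no clipboard involved:
            Worksheets("Results").Cells(i, 1).Resize(1, 5).Value = _
                Worksheets("Model").Range("B2:F2").Value
            Application.CutCopyMode = False   ' just in case anything was copied
        Next i
        Application.ScreenUpdating = True
    End Sub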