I created a new form for our field personnel to complete daily. They are telling me that over a few weeks the file grows in size: it starts out at 335KB and ends up, after a few weeks, at around 3.5MB. Now, they add on some other sheets and things which undoubtedly make the file larger, but I am wondering if there is any way that VBA could somehow make the file bigger as they open it each day over a few weeks? They are not adding in any VBA; they are only using the code which is provided, which is accounted for in the 335KB. It seems unlikely, but I thought I would ask.
I have created a monster Excel file that contains loads of VLOOKUPs; it is 95MB and extremely slow.
I guess that VLOOKUP creates tables in the background, hence the file size. Does anyone have any ideas on how to decrease the file size while keeping the VLOOKUP functionality?
I have what may be a math question as much as an Excel question. See my attached spreadsheet example in which I grow the denominator of a ratio by a fixed percentage until it equals or exceeds the numerator. This results in the denominator being grown X number of times.
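For reference, the number of growth steps works out to a logarithm: the smallest whole number n for which denominator * (1 + rate)^n >= numerator. A minimal sketch of that as a VBA worksheet function (the names are purely illustrative, not taken from the attached example):

    Function GrowthSteps(numerator As Double, denominator As Double, rate As Double) As Long
        ' Smallest whole number of growth steps n with denominator * (1 + rate)^n >= numerator
        If denominator >= numerator Then
            GrowthSteps = 0                       ' already at or above the numerator
        Else
            ' Log is the natural logarithm in VBA; -Int(-x) rounds up to the next whole number
            GrowthSteps = -Int(-(Log(numerator / denominator) / Log(1 + rate)))
        End If
    End Function

For example, growing a denominator of 100 by 5% until it reaches 200 takes 15 steps, since 100 * 1.05^14 is roughly 198.0 while 100 * 1.05^15 is roughly 207.9.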
I have a workbook which has a file size much bigger than I would expect. To test it, I inserted one blank worksheet and deleted everything else. I have removed a macro that was embedded in the workbook, and I have removed all protection etc. I would now expect the file to be similar in size to a new workbook (circa 13KB), but it is still 782KB even though there is absolutely nothing in it.
I built a workbook that is used by our account team to summarize account status. Each worksheet has several cells that use conditional formatting to call attention to the status reported in that cell. Each manager sends their worksheets to a senior manager, who then consolidates them into a single workbook containing approximately 50 worksheets. Once consolidated, the file size has grown to over 12MB, and this needs to be reduced to a more manageable number, preferably 4MB or less. Prior to building the new version of this workbook, the file size was less than 2MB.
I have removed the Macros and believe the size is due to the conditional formatting in each worksheet.
Once the worksheets are compiled, is there a way to remove the conditional formatting but keep the resulting format - similar to a "Paste Values" function? The senior manager does not need to change the cell values, so the formatting of the worksheet he receives could simply be preserved.
To be specific: one cell has no color if the value is 10 or less, is yellow if 11-15, and red if 16+. Once sent to the manager, the variance in this cell is not needed; he only needs to see the color associated with the value at the time it was sent. Next week he will receive an updated worksheet and a new value will be entered. The conditional formatting is only needed by the individual preparing the original worksheet.
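To illustrate the "Paste Values for formatting" idea, here is a minimal sketch, assuming Excel 2010 or later (it relies on the DisplayFormat property) and assuming only the fill colour needs to be kept. It stamps the colour each cell currently displays onto the cell as ordinary formatting, then deletes the conditional formatting rules:

    Sub FreezeConditionalColours()
        Dim ws As Worksheet, cell As Range
        For Each ws In ActiveWorkbook.Worksheets
            For Each cell In ws.UsedRange
                ' DisplayFormat reflects the formatting as currently rendered,
                ' including any conditional formatting (Excel 2010+)
                If cell.DisplayFormat.Interior.ColorIndex <> xlColorIndexNone Then
                    cell.Interior.Color = cell.DisplayFormat.Interior.Color
                End If
            Next cell
            ws.Cells.FormatConditions.Delete   ' drop the rules, keep the stamped colours
        Next ws
    End Sub

The senior manager's consolidated copy would then carry plain fills instead of 50 worksheets' worth of conditional formatting rules.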
I am creating a statistics database for my personal use. I have been developing this for many, many months now, and I have learned a lot about Excel and its capabilities in the process. However, I am becoming concerned because my files occasionally crash Excel, and I am wondering whether this is because my files are getting too large. Typically, the crashes occur when the main file is calculating (but not always). It stops responding, and I have to force the files closed through Windows Task Manager.
Without getting into the specifics of how the files work (although we can do that if it is necessary), here are the names of the files and their current sizes:
I will have all three of these files open simultaneously when I am working on them, and my laptop has 4GB of RAM. I do not have much (if any) extraneous formatting in the files -- I know how to remove that -- but there is quite a bit of conditional formatting in the files that I want to maintain. I am considering partitioning (separating) the tabs in the Database file into separate files.
5 worksheets, 30 columns per worksheet, and a total of 1,300 rows of data across the 5 tabs.
The row heights are not uniform across all worksheets, but when I change them all (down to row 65536 on each sheet) to a height of 12, the file reduces in size from 11MB to 3MB.
It concerns me, as it seems almost like data could be getting lost.
I have a workbook that permits the user to enter data in no particular order. In the original I am working from, there could be over 1,500 rows of data. I created a new worksheet that accepts the unordered data so it can be sorted by customer and date. I then created 20+ new worksheets within the same workbook that accept, via an IF statement, the data for each customer. The problem as I see it is that, as the formula runs, I need to delete the blank rows on each of the 20+ worksheets. This is fine and the customer is happy with the setup. My problem is the file size: with just the formulas/worksheets, it is over 62MB.
Is there a different formula/function I can use that won't require so much space?
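For what it's worth, a minimal sketch of the blank-row cleanup step described above, assuming a "blank" row means the entire row is empty and that the routine is called once per customer sheet (nothing here is specific to the actual workbook):

    Sub DeleteBlankRows(ByVal ws As Worksheet)
        Dim lastRow As Long, r As Long
        lastRow = ws.UsedRange.Row + ws.UsedRange.Rows.Count - 1
        ' Work from the bottom up so deleting a row does not shift rows still to be checked
        For r = lastRow To 1 Step -1
            If Application.WorksheetFunction.CountA(ws.Rows(r)) = 0 Then
                ws.Rows(r).Delete
            End If
        Next r
    End Sub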
I created a new Excel file with filtering, and after I saved the file I found that its size is 19MB. Is this a normal size for an Excel file, or is there a problem in this file that I have to fix?
I don't know if there is a way around this, but we have an Excel workbook that holds around 14 worksheets and is getting too large. It is currently around 2.5MB, and we constantly use it as a template which is completed and emailed. Sometimes we have to email 4 of these all together, so the email is too large; even zipping them has next to no effect on the size. The sheet that seems to be causing the problem is one that has around 500 lines of formulas across 20 columns. Most of these fields have conditional formatting, and it's this that is boosting the size.
I am working on a spreadsheet that pulls data from an external pivot table.
The spreadsheet is 25 columns wide by 72 rows. I have 15 worksheets currently but need to add another 7. The current file size is 22MB and I keep getting a NO MEMORY error message.
I am not using any fancy formulae, just references to an external pivot table.
I currently have an Excel file with lots of formulas on quite a few pages, and these are carried down for many records - it's sort of a template to generate information from/to certain sheets for many other purposes.
- Is there any way to compress the file (without actually zipping it), as it must remain a working document?
My Excel file has grown to 23MB. It has around 10 worksheets and a similar number of userforms. Because the file is large, it slows down the VBA code when it encounters a save command. Is there any way I can improve this? Do userforms use a lot of memory, or is it the length of the code? There are a few images on the userforms, but the images total less than 1MB when all added together.
I am putting together a price list in Excel and using images and hyperlinks to jazz it up a bit. What I am finding is that if I insert a 25KB GIF image, the Excel file grows in size by over 100KB.
Why is this disproportionate growth happening, and is there a way around it? I don't want my price list growing too large, but I need the images.
We are using an Excel file which doesn't contain more than 1MB of data and has a few COUNT and SUM formulae, but the size of the file keeps fluctuating from 1MB to 500MB. Sometimes the size of the file will be 1MB, and once you refresh the window the file size will go to 480MB or some other random number.
Because of this, the Excel file is quite slow and very difficult to work with.
I have a workbook that grows in size on every use. I import a text file, manipulate the data, manually export it and delete the data from the workbook. So why does it continue to grow in size? By manipulating I mean adding rows and adding text to the new rows. Attached is a copy of the workbook in its native form (the macro has not been run) and a sample file to import; the Excel file is 59KB in size and the test file is 9KB. I have seen my original file grow to over 500KB from repeating the same macro on the same file many times, or from running many different files for testing.
There is a CommandButton on Sheet1 that opens a userform. Click in the largest textbox and it will open a file dialog box at "C:". Place the supplied text file in "C:" or some other folder of your choice, as you can navigate to any folder or drive from the dialog box. The macros that add lines and/or text are "JumpToLabelName" (this runs first) and "InsertRows", which runs second. These two macros do all the work.
Take note: I store information in the registry under "VB and VBA Program Settings" in a key called "GeoMeasure" (a normal Windows registry entry). You can search for "GeoMeasure" and delete it if you decide to test-run the file.
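For reference, a minimal sketch of how such settings can be read and removed from VBA itself; the "GeoMeasure" application name comes from the workbook, but the section and key names below are hypothetical:

    Sub ClearGeoMeasureSettings()
        ' Values written with SaveSetting "GeoMeasure", ... live under
        ' HKEY_CURRENT_USER\Software\VB and VBA Program Settings\GeoMeasure
        Debug.Print GetSetting("GeoMeasure", "Options", "LastFolder", "<not set>")  ' hypothetical section/key names
        DeleteSetting "GeoMeasure"   ' removes the whole GeoMeasure key once testing is done
    End Sub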
The bulk of the file (6MB+) is in one sheet which is a data store.
All headings are on row 1 and are duplicated across 4 times for ease of reading.
Columns A-D have the main data and comprise a total of 39,751 lines with a lot of duplication (basically it is 30 items for 25 clients (identical setups) for 53 weeks).
Columns E-H have an average for 4 or 5 weeks (depending on the period) for each item for each client and take up 9,001 rows.
Columns I-L have the quarterly averages (i.e. the average of 3 periods) and take up 3,001 rows.
Columns N-P have the data which is displayed on other sheets and take up 750 rows.
Now, in columns B, F, J and N I have the name of the client in row 2 (for example) and then the formula "=A2" in row 3 and down. This is repeated for each client (29 formulas each).
Is there any way I can batter the size of this sheet down?
I am about to undertake a fairly large spreadsheet development project. One of the key requirements is that the size of the spreadsheet be kept below 4MB, preferably below 1MB. This is due to an unavoidable restriction.
Although the spreadsheet will not have large volumes of data, it will contain approximately 30 sheets. There are no graphics embedded in the sheets. I imagine there will be a reasonable amount of VBA and/or formulas in the sheets. Each sheet will basically be a data-entry form with a number of formulas. No sheet will have more than 100 rows.
Without knowing the full details of the spreadsheet, it is difficult for you to give a definite answer as to the size of the sheet, so I am simply looking for some helpful tips in keeping the size down.
Do formulas take up more space than VBA code? When I format cells, should I format the entire row, or just the range? Do features like sheet and cell protection take up a lot of space?
I have a workbook that I am working on that seems to be having file size growth issues.
All it is is 2 worksheets.
Worksheet 1 - columns A-S and 63 rows. That worksheet alone is 7MB.
No macros, no VBA, no formulas - just straight text data with a handful of auto-filters. I took the auto-filters off and saved to see if that was the issue, and it did not make a difference.
Worksheet 2 - columns A-S and 35 rows. This worksheet is almost 3MB.
Now I can make a change (when my computer catches up) to 2 cells, and sometimes it will jump in size by 1MB or more.
I have an Excel file that is about 1,400KB in size. This file is the back end for a complex model with many inputs, and I now want to build a front end for it using userforms and VBA. However, when I created the first userform, my Excel file size blew out to about 7,000KB. When I added all of my userforms (about 16 of them, because there are lots of inputs organised into categories), the file size grew to 77,000KB.
I have tried everything to reduce the file size, including clearing all unused cells, avoiding pivot tables of any sort, and exporting then re-importing all of my VBA modules and userforms. But the situation stays the same: with no userforms or VBA my file size is about 1,400KB; add in all the userforms, and the file size is 77,000KB. Is there a way to keep the file size down with userforms, or is that just the way it is? Is the only way to keep the file size down to have a single userform with all questions on that one form, perhaps with visible and invisible controls depending on the 'page' of questions being viewed?
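If it is any use, a minimal sketch of that single-userform idea, assuming the controls for each 'page' are grouped in frames named fraPage1, fraPage2, and so on (those names are hypothetical):

    ' Code behind the userform: show only the frame for the requested page
    Private Sub ShowPage(ByVal pageNumber As Long)
        Dim ctl As MSForms.Control
        For Each ctl In Me.Controls
            If TypeOf ctl Is MSForms.Frame Then
                ctl.Visible = (ctl.Name = "fraPage" & pageNumber)
            End If
        Next ctl
    End Sub

A built-in MultiPage control achieves much the same effect without the visibility juggling, but either way all 16 forms' worth of controls end up on one form.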
I have a spreadsheet with a macro attached to it: approximately 500 lines of code and 3 sheets of approximately 30 rows by 30 columns. It recently shot up from 130KB to 15,000KB. Do I need to rebuild this thing from scratch, or is there a way of finding where the fat is? I have tried clearing all columns beyond my active table area and all rows below my active area.
I have a folder containing about 100,000 files and subfolders. Approximately 20% of the files are duplicates. I would like to create a list of all the files with some details so that I can sort through and identify which ones are duplicates, then create a batch file to delete them. How can I get a file list with additional details, like size and modified date, into Excel?
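A minimal sketch of pulling such a list into a sheet with the Scripting runtime, assuming the starting folder path and the output layout (columns A-D of the active sheet) are placeholders:

    Sub ListFiles()
        Dim fso As Object, outputRow As Long
        Set fso = CreateObject("Scripting.FileSystemObject")
        Range("A1:D1").Value = Array("Full path", "Name", "Size (bytes)", "Modified")
        outputRow = 2
        ListFolder fso.GetFolder("C:\MyFolder"), outputRow   ' hypothetical starting folder
    End Sub

    Sub ListFolder(ByVal fld As Object, ByRef outputRow As Long)
        Dim f As Object, subFld As Object
        ' Write one row per file in this folder
        For Each f In fld.Files
            Cells(outputRow, 1).Value = f.Path
            Cells(outputRow, 2).Value = f.Name
            Cells(outputRow, 3).Value = f.Size
            Cells(outputRow, 4).Value = f.DateLastModified
            outputRow = outputRow + 1
        Next f
        ' Recurse into subfolders
        For Each subFld In fld.SubFolders
            ListFolder subFld, outputRow
        Next subFld
    End Sub

With roughly 100,000 files this will take a while writing cell by cell, but the resulting sheet can then be sorted by name and size to spot the duplicates.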