What Makes .exe File So Large
May 29, 2011
I am writing an application in VS2010 using VB.NET. The application is relatively simple and I would expect the .exe to be less than 1 MB. I have written a couple of applications that are quite a bit larger, and they come in under 2 MB. I am compiling in "Release" mode. The file size is 29,127 KB (in Debug mode it is 29,167 KB). Where do I start looking to find out why the .exe is so large?
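One common cause of an oversized executable is files embedded as resources (images, media, or data files pulled into the project). A minimal sketch of a diagnostic, assuming it is run from inside the application itself, that lists every embedded resource and its size:

Imports System.IO
Imports System.Reflection

Module ResourceCheck
    Sub Main()
        ' List each resource embedded in the running assembly and its size,
        ' since embedded resources are a common cause of .exe bloat.
        Dim asm As Assembly = Assembly.GetExecutingAssembly()
        For Each name As String In asm.GetManifestResourceNames()
            Using stream As Stream = asm.GetManifestResourceStream(name)
                Console.WriteLine("{0}: {1:N0} bytes", name, stream.Length)
            End Using
        Next
    End Sub
End Module

Resources added through the project's Resources page, or files whose Build Action is set to Embedded Resource, end up inside the .exe, which is the usual reason a simple project compiles to tens of megabytes.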
View 2 Replies
Dec 27, 2010
I just published this simple console application that is supposed to show a textbox with the value of a setting called "userID", whose value is 1001. This works like a charm. Now I need to change this value outside the editor, from Notepad etc. When I open the application file, most of it is nonsense, but with a quick Ctrl+F I found the value 1001 and changed it to some other integer.
I ran the application again and it failed, without even a useful error message. At one point I tried just opening a freshly published, non-corrupted version of the application, changed nothing, saved it from Notepad, and it was also corrupted. It seems like Notepad can't handle some of the characters. Do I need to publish the application with some specific text encoding? I use VB.NET for this.
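Editing the compiled .exe in Notepad will corrupt it, because Notepad re-saves the binary with a text encoding. A value meant to be changed outside the IDE is normally read from a plain-text config file instead. A minimal sketch, assuming a user-scoped My.Settings entry named userID:

Module SettingsDemo
    Sub Main()
        ' Read and persist the value through the settings API; the saved
        ' user.config is ordinary XML and safe to edit in Notepad.
        Console.WriteLine("Current userID: " & My.Settings.userID)
        My.Settings.userID = 1002   ' hypothetical new value
        My.Settings.Save()          ' written to user.config, not the .exe
    End Sub
End Module

An application-scoped setting also ends up in the .exe.config file next to the executable, which is plain XML and can be edited in Notepad without touching the .exe itself.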
View 4 Replies
Jun 9, 2011
I want to write a program for a Markov chain, but my state space is quite large. First I calculate all the transition probabilities and revenues for all states (1,381,860 total states) and store them in a multidimensional array: Public RevArr(0 To 9, 0 To 750, 0 To 282) As Long
After that, the Markov chain iteration should use these as inputs to calculate the steady-state probabilities. But when I try to run the main code I get this error: Exception of type 'System.OutOfMemoryException' was thrown.
The following is the declaration of the second array; I just added another dimension for storing all the iterations, but this is what triggers the error: Dim stateprob(IT + 1, 0 To 9, 0 To 750, 0 To 282) As Single
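For reference, a single (0 To 9, 0 To 750, 0 To 282) slice of Single values is roughly 10 x 751 x 283 x 4 bytes, about 8.5 MB, so keeping IT + 2 copies of it runs out of memory quickly. A minimal sketch of one way around that, assuming each iteration only needs the previous iteration's probabilities: keep two buffers and swap them.

Module SteadyState
    Sub Main()
        Dim IT As Integer = 1000                        ' iteration count, as in the question
        ' Hold only the previous and current iteration instead of one slice
        ' per iteration, and swap the two buffers each pass.
        Dim prevProb(0 To 9, 0 To 750, 0 To 282) As Single
        Dim currProb(0 To 9, 0 To 750, 0 To 282) As Single
        For iteration As Integer = 1 To IT
            ' ... compute currProb(i, j, k) from prevProb and RevArr here ...
            Dim tmp As Single(,,) = prevProb
            prevProb = currProb
            currProb = tmp                              ' reuse the old buffer next pass
        Next
    End Sub
End Module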
View 1 Replies
Aug 26, 2011
This code takes about 30 minutes to run with high CPU usage. What is the problem?
Do
strLine = objReader.ReadLine()
If strLine Is Nothing Then
[code].....
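Without the rest of the loop it is hard to say, but the usual cause of a read like this taking 30 minutes is concatenating each line onto a String inside the loop, which grows quadratically. A minimal sketch of the same loop buffered with a StringBuilder (the file name is hypothetical):

Imports System.IO
Imports System.Text

Module FastRead
    Sub Main()
        ' Append to a StringBuilder instead of concatenating Strings, which
        ' is what usually makes a line-by-line read take this long.
        Dim sb As New StringBuilder()
        Using objReader As New StreamReader("input.txt")
            Dim strLine As String = objReader.ReadLine()
            Do Until strLine Is Nothing
                sb.AppendLine(strLine)
                strLine = objReader.ReadLine()
            Loop
        End Using
        Console.WriteLine("Read {0:N0} characters", sb.Length)
    End Sub
End Module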
View 3 Replies
Aug 26, 2011
This code takes about 60 minutes to run with high CPU usage on a text file of 90,000 lines. What is the problem? [code]
View 9 Replies
Apr 4, 2012
The file is a UTF-8 text file.
Each character has a varying number of bytes and each line has a varying number of characters.
Does VB.NET have a function that maps line numbers to byte locations, or something like that?
Also, after that, how do I read from that location?
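There is no built-in table of line numbers to byte locations, but one can be built in a single pass: in UTF-8 the line-feed byte (0x0A) never appears inside a multi-byte character, so counting LF bytes gives correct line starts. A minimal sketch (the file name and the line number to fetch are hypothetical):

Imports System.Collections.Generic
Imports System.IO
Imports System.Text

Module LineIndex
    Sub Main()
        ' Build a line-number -> byte-offset index in one pass, then seek
        ' straight to a line and decode it as UTF-8.
        Dim offsets As New List(Of Long)()
        offsets.Add(0)                                     ' line 0 starts at byte 0
        Using fs As New FileStream("big.txt", FileMode.Open, FileAccess.Read)
            Dim b As Integer = fs.ReadByte()
            While b <> -1
                If b = &HA Then offsets.Add(fs.Position)   ' byte after each LF starts the next line
                b = fs.ReadByte()
            End While

            ' Read a specific line (0-based) directly from its byte offset.
            Dim wanted As Integer = 1000
            If wanted < offsets.Count Then
                fs.Seek(offsets(wanted), SeekOrigin.Begin)
                Using sr As New StreamReader(fs, Encoding.UTF8)
                    Console.WriteLine(sr.ReadLine())
                End Using
            End If
        End Using
    End Sub
End Module

Reading byte by byte is the simplest form of the scan; for speed the same LF count can be done over buffered byte chunks.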
View 1 Replies
Dec 17, 2008
I have a problem: how can I import a HEX file into my program in VB.NET? And if I can import it, what size of file can I import? I have 200 MB - 400 MB HEX files. Can I import files that huge?
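Files of that size can be handled as long as they are not read into memory in one piece. A minimal sketch (file name hypothetical) that walks a 200-400 MB file in fixed-size chunks, so memory use stays at one buffer regardless of file size:

Imports System.IO

Module ChunkedRead
    Sub Main()
        ' Process the file in 64 KB chunks instead of loading 200-400 MB at once.
        Dim buffer(65535) As Byte
        Dim total As Long = 0
        Using fs As New FileStream("dump.hex", FileMode.Open, FileAccess.Read)
            Dim read As Integer = fs.Read(buffer, 0, buffer.Length)
            While read > 0
                ' ... parse or convert the "read" bytes in buffer here ...
                total += read
                read = fs.Read(buffer, 0, buffer.Length)
            End While
        End Using
        Console.WriteLine("Processed {0:N0} bytes", total)
    End Sub
End Module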
View 18 Replies
Mar 16, 2010
I am trying to read this XML document.
An excerpt:
<datafile xmlns:xs="[URL]" xmlns:xsi="[URL]" xsi:noNamespaceSchemaLocation="wiitdb.xsd">
<WiiTDB version="20100217113738" games="2368"/>
<game name=" Wanted: 50 Wacky Jobs (DEMO) (USA) (EN)">
<id>DHKE18</id><type/>
<region>NTSC-U</region>
[Code] .....
It just skips the "While iter.MoveNext" part of the code. I tried it with a simple XML file, and that works fine.
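If the iterator's XPath expression matches nothing, the body of While iter.MoveNext() is never entered, which looks exactly like the loop being skipped. A minimal sketch against the excerpt above (the file name is hypothetical, and it assumes the document really has no default namespace, as the noNamespaceSchemaLocation attribute suggests):

Imports System.Xml.XPath

Module XPathDemo
    Sub Main()
        ' Select the <game> elements under the <datafile> root shown in the
        ' excerpt and print each one's name attribute.
        Dim doc As New XPathDocument("wiitdb.xml")
        Dim nav As XPathNavigator = doc.CreateNavigator()
        Dim iter As XPathNodeIterator = nav.Select("/datafile/game")
        While iter.MoveNext()
            Console.WriteLine(iter.Current.GetAttribute("name", ""))
        End While
    End Sub
End Module

If the real file does declare a default namespace, the expression has to be qualified through an XmlNamespaceManager or it will match nothing, which is a common reason such a loop appears to be skipped.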
View 1 Replies
Jan 5, 2012
I have a simple app that reads from a very large text file and returns a value if a string is found. I can instruct users where to download the file and where to put it, but it would be nice if I could embed the file with the publish, so that the program knew where to look by default. Getting users to download a separate file is painful. This file has 1.4 million lines of text. I really need the app to look for the file in a predictable place and run against that by default for most users. I can have experienced users browse for a new file, but most people aren't into that much thought.
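One option is to add the text file to the project as a content item that is copied to the output folder and included in the publish, so the program can always look next to its own executable first. A minimal sketch (the file name lookup.txt is hypothetical):

Imports System.IO

Module DefaultDataFile
    ' Default to a copy of the file that ships alongside the application,
    ' and only fall back to a path the user has chosen.
    Function DefaultDataPath() As String
        Return Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "lookup.txt")
    End Function

    Function ResolveDataPath(ByVal userChosenPath As String) As String
        If Not String.IsNullOrEmpty(userChosenPath) AndAlso File.Exists(userChosenPath) Then
            Return userChosenPath
        End If
        Return DefaultDataPath()
    End Function
End Module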
View 8 Replies
Oct 14, 2009
I'm trying to load a large CSV file into a Microsoft SQL Server Compact 3.5 database. I've tried using the following:
Using MyReader As New Microsoft.VisualBasic.FileIO.TextFieldParser("filename.txt")
MyReader.TextFieldType = FileIO.FieldType.Delimited
MyReader.SetDelimiters(",")
Then splitting the data with MyReader.ReadFields() etc., before using this data to add rows to a dataset in a database table in my project. However, my CSV files are very large, at above 9.5 million rows, and this takes forever, if the computer doesn't crash as well. Does anyone have a better idea of what to do? I would like the CSV file loaded into the database table so that I can sort it and run some queries and maths on it. The CSV data structure is:
2,193,761.4000000000001
2,43,1510.2
2,8,1929.6000000000001
2,22,2564.5
2,22,2791.7000000000003
2,19,2971.6000000000004
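One faster route for SQL Server Compact is to skip the DataSet entirely and insert through a TableDirect SqlCeResultSet, which avoids per-row command overhead. A minimal sketch, with the connection string, table name, and the three-column layout assumed from the sample data (requires a reference to System.Data.SqlServerCe):

Imports System.Data
Imports System.Data.SqlServerCe
Imports Microsoft.VisualBasic.FileIO

Module BulkLoad
    Sub Main()
        ' Stream rows from TextFieldParser straight into the table through
        ' an updatable TableDirect result set.
        Using conn As New SqlCeConnection("Data Source=mydata.sdf")      ' assumed database
            conn.Open()
            Dim cmd As SqlCeCommand = conn.CreateCommand()
            cmd.CommandType = CommandType.TableDirect
            cmd.CommandText = "MyTable"                                   ' assumed table: int, int, float
            Using rs As SqlCeResultSet = cmd.ExecuteResultSet(ResultSetOptions.Updatable)
                Using MyReader As New TextFieldParser("filename.txt")
                    MyReader.TextFieldType = FieldType.Delimited
                    MyReader.SetDelimiters(",")
                    While Not MyReader.EndOfData
                        Dim fields() As String = MyReader.ReadFields()
                        Dim rec As SqlCeUpdatableRecord = rs.CreateRecord()
                        rec.SetInt32(0, CInt(fields(0)))
                        rec.SetInt32(1, CInt(fields(1)))
                        rec.SetDouble(2, CDbl(fields(2)))
                        rs.Insert(rec)
                    End While
                End Using
            End Using
        End Using
    End Sub
End Module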
View 3 Replies
Oct 7, 2011
I need to read the second-to-last line of a very large log file.
I can't read the entire thing into memory, count lines, etc. I can't use FileStream.SetLength because that needs read/write access and the log will be opened by another application. And it has to be fast. However, the file ends in a CR/LF, meaning the last line is actually empty. Been struggling with this all day, and it's hurting my head! Not good on a Friday!
I have a function that can read the last line but I can't get it to go one line up. I can get it to read characters from the end of the line, but that's not much help with a variable-length line!
Maybe FileStream with SeekOrigin would work, since it can run backwards, looking at an example from MSDN - I need to get the data before the last CR/LF and stop at the next one... the problem is that that example also writes the text out backwards.
I'm monitoring the log for particular entries for issues that are causing us grief at the moment.
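A FileStream opened with FileShare.ReadWrite can scan backwards from the end of the log even while the other application keeps it open. A minimal sketch (path and ASCII encoding are assumptions); linesUp says how many line breaks up from the end to read, so the trailing empty line can be stepped over:

Imports System.Collections.Generic
Imports System.IO
Imports System.Text

Module TailReader
    Function LineFromEnd(ByVal path As String, ByVal linesUp As Integer) As String
        ' Walk backwards counting LF bytes and collect the characters of the
        ' requested line; CR bytes are skipped.
        Using fs As New FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
            Dim bytes As New List(Of Byte)()
            Dim breaks As Integer = 0
            Dim pos As Long = fs.Length - 1
            While pos >= 0
                fs.Seek(pos, SeekOrigin.Begin)
                Dim b As Integer = fs.ReadByte()
                If b = &HA Then                       ' LF
                    breaks += 1
                    If breaks > linesUp Then Exit While
                ElseIf b <> &HD Then                  ' ignore CR
                    If breaks = linesUp Then bytes.Insert(0, CByte(b))
                End If
                pos -= 1
            End While
            Return Encoding.ASCII.GetString(bytes.ToArray())
        End Using
    End Function
End Module

With a file that ends in CR/LF, LineFromEnd(logPath, 1) returns the last non-empty line and LineFromEnd(logPath, 2) the one above it.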
View 5 Replies
Jan 30, 2012
My problem is that I have very large text files (approx. 2 GB+). They contain records, one per line. The lines are not the same length and the data can be different lengths all the time. I am currently reading the file line by line, then splitting the data by common characters in the records. Processing the full file currently takes 3 hours. This is way too slow for its purpose.
View 4 Replies
Jul 6, 2011
I have a problem reading a text file using StreamReader. The file has between 500,000 and 1,000,000 lines. When I try to read it in a loop, I get an error. That's why I tried the StreamReader.ReadToEnd method. It worked fine: I get the entire contents of the file in one string. So far everything is okay, but I have a small problem searching this huge string. I have to reformat the string into my desired format. I'll try to be more specific: the format of the input file is as follows:
50471100 8 2 6 5 0<LF><CR>
00000016 365442 12231<LF><CR>
00000026 112166 31133<LF><CR>
<end>
[Code]...
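One way to work through the single string from ReadToEnd is to wrap it in a StringReader and process it record by record; that also tolerates the unusual <LF><CR> line endings, which just show up as blank lines to skip. A minimal sketch (the file name is hypothetical and the field handling is a placeholder, since the target format isn't shown):

Imports System.IO

Module ReformatDemo
    Sub Main()
        ' Walk the big string line by line and split each record on spaces;
        ' what "reformat" means is left as a placeholder.
        Dim content As String = File.ReadAllText("input.txt")
        Using reader As New StringReader(content)
            Dim line As String = reader.ReadLine()
            While line IsNot Nothing
                line = line.Trim()
                If line.Length > 0 AndAlso line <> "<end>" Then
                    Dim fields() As String = line.Split(New Char() {" "c}, StringSplitOptions.RemoveEmptyEntries)
                    ' ... write the fields back out in the desired format ...
                End If
                line = reader.ReadLine()
            End While
        End Using
    End Sub
End Module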
View 7 Replies
May 1, 2009
Is there any way to reliably know when a large file has completely finished copying to a particular folder? For example, Computer1 copies a large file to \\Server1\Share1. On \\Server1\Share1, I want to do something AFTER the file is done copying, without any intervention from Computer1.
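A common approach on the server side is to watch the folder with FileSystemWatcher and then poll until the file can be opened exclusively; the open only succeeds once the copy from Computer1 has released its handle. A minimal sketch of the check:

Imports System.IO

Module CopyComplete
    Function IsFileReady(ByVal path As String) As Boolean
        ' The exclusive open throws an IOException while the copy still holds
        ' the file, and succeeds once it is finished.
        Try
            Using fs As New FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None)
                Return True
            End Using
        Catch ex As IOException
            Return False
        End Try
    End Function
End Module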
View 2 Replies
Dec 6, 2010
I have a 133 MB file which contains almost a million records. Currently, a user loads this file into KEdit (a great editor for working with large files) and changes occurrences of a dollar sign $ to a blank, and takes negative numbers represented with parentheses and changes them to use a negative sign. That is, (5000.23) would become -5000.23. So the leading ( becomes a - and the trailing ) becomes a blank. I believe the only occurrences of ( and ) are around the numbers I want to change, so I don't have to worry about changing something that should have been left alone. Using VB.NET in Visual Studio 2008, is there a "painless" way to do this other than reading the file one record at a time and searching/replacing and writing the record out? While not really painful, I am worried about how long that will take (to run, not to code). Is that a valid concern?
My long range goal is to automate many of the routine file preparation tasks my users do. We get an input file from 15 clients. The file can be in any format the client has chosen to give us, and it is our burden to reformat any errant fields into an acceptable format for insertion into our database.
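A straight streaming pass is usually painless enough for a 133 MB file, on the stated assumption that parentheses only ever wrap negative numbers. A minimal sketch (file names are hypothetical; "blank" is taken to mean a space character here):

Imports System.IO

Module CleanFile
    Sub Main()
        ' One pass, one line in memory at a time: $ and the trailing ) become
        ' a space, the leading ( becomes a minus sign.
        Using reader As New StreamReader("input.txt")
            Using writer As New StreamWriter("output.txt")
                Dim line As String = reader.ReadLine()
                While line IsNot Nothing
                    line = line.Replace("$", " ").Replace("(", "-").Replace(")", " ")
                    writer.WriteLine(line)
                    line = reader.ReadLine()
                End While
            End Using
        End Using
    End Sub
End Module

On a local disk a pass like this over 133 MB normally finishes in seconds to a couple of minutes, so run time is unlikely to be the real concern.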
View 2 Replies
Apr 22, 2009
Is there a quick way of finding a character? I get this error:
Invalid character in the given encoding. Line 1, position 7173.
Eek, I need to know what this character is, but I don't really want to count across 7,000+ letters.
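Rather than counting across the line, the bytes around the reported offset can be dumped directly. A minimal sketch (file name hypothetical; it assumes the characters before the error are single-byte, so column 7173 on line 1 roughly matches byte 7172):

Imports System.IO

Module FindBadChar
    Sub Main()
        ' Show the raw bytes surrounding the position the parser complained about.
        Dim bytes() As Byte = File.ReadAllBytes("data.xml")
        Dim pos As Integer = 7173 - 1                    ' the reported position is 1-based
        Dim start As Integer = Math.Max(0, pos - 10)
        Console.WriteLine(BitConverter.ToString(bytes, start, Math.Min(20, bytes.Length - start)))
    End Sub
End Module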
View 2 Replies
Mar 15, 2012
I have a large text file which I want to split into many different items.
In the file there is a time stamp between each item; it looks like this:
01:20 a.m.
01:44 p.m.
In between these items is information like you would see in a log file.
How can I split the file into separate items like this?
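If the time stamps are the only reliable separators, Regex.Split can cut the whole text on them. A minimal sketch (the file name is hypothetical and the pattern is an assumption based on the two samples shown):

Imports System.IO
Imports System.Text.RegularExpressions

Module SplitByTime
    Sub Main()
        ' Split the file wherever a "01:20 a.m." style time stamp appears;
        ' everything between two stamps becomes one item.
        Dim text As String = File.ReadAllText("log.txt")
        Dim items() As String = Regex.Split(text, "\d{2}:\d{2}\s+[ap]\.m\.")
        For Each item As String In items
            Console.WriteLine("---- item ----")
            Console.WriteLine(item.Trim())
        Next
    End Sub
End Module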
View 13 Replies
Jan 30, 2011
I have a large Wikipedia dump that I want to cut into different files (1 file for each article). I wrote a VB app to do it for me, but it was quite slow and crapped out after a few hours of cutting. I'm currently splitting the file into smaller 50 MB chunks using another app, but that's taking a long time (20-30 minutes for each chunk). I should be able to cut each of these up individually if I do this.
Does anyone have suggestions for a way to cut this file up more quickly?
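If the dump is the standard XML export, each article sits inside a <page> element, so the split can be done in one streaming pass: read line by line and start a new output file at every <page>. A minimal sketch (file names hypothetical):

Imports System.IO

Module SplitDump
    Sub Main()
        ' Never holds more than one line in memory, so it should not bog down
        ' on a multi-gigabyte dump.
        Dim count As Integer = 0
        Dim writer As StreamWriter = Nothing
        Using reader As New StreamReader("enwiki.xml")
            Dim line As String = reader.ReadLine()
            While line IsNot Nothing
                If line.Contains("<page>") Then
                    If writer IsNot Nothing Then writer.Close()
                    count += 1
                    writer = New StreamWriter("article_" & count.ToString() & ".xml")
                End If
                If writer IsNot Nothing Then writer.WriteLine(line)
                line = reader.ReadLine()
            End While
        End Using
        If writer IsNot Nothing Then writer.Close()
        Console.WriteLine("Wrote {0:N0} article files", count)
    End Sub
End Module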
View 4 Replies
Oct 8, 2011
I have a large file (2.7 GB). I need to split it into smaller files. How can I split a large file into smaller files using VB.NET 2003? I cannot use LINQ, and the resources (CPU and memory) in the operating environment are very limited (it is a shared hosting environment).
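Splitting can be done as a plain chunked byte copy, which needs no LINQ and only a single small buffer. A minimal sketch written without Using blocks so it also compiles under VB.NET 2003 (file names and the part size are hypothetical):

Imports System.IO

Module FileSplitter
    Sub Split(ByVal sourcePath As String, ByVal chunkBytes As Long)
        ' Copy chunkBytes at a time into numbered part files; memory stays at
        ' one 64 KB buffer regardless of the 2.7 GB input.
        Dim buffer(65535) As Byte
        Dim input As New FileStream(sourcePath, FileMode.Open, FileAccess.Read)
        Try
            Dim part As Integer = 0
            Do While input.Position < input.Length
                Dim output As New FileStream(sourcePath & ".part" & part.ToString(), FileMode.Create)
                Try
                    Dim written As Long = 0
                    Do While written < chunkBytes
                        Dim toRead As Integer = CInt(Math.Min(CLng(buffer.Length), chunkBytes - written))
                        Dim read As Integer = input.Read(buffer, 0, toRead)
                        If read = 0 Then Exit Do
                        output.Write(buffer, 0, read)
                        written += read
                    Loop
                Finally
                    output.Close()
                End Try
                part += 1
            Loop
        Finally
            input.Close()
        End Try
    End Sub

    Sub Main()
        Split("C:\data\big.dat", 100L * 1024L * 1024L)   ' 100 MB parts, paths assumed
    End Sub
End Module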
View 1 Replies
May 23, 2011
I have created a simple BackgroundWorker process to copy a large file (30 GB). Is there any way to report the progress of that file copy?
I'm using System.IO.File.Copy to perform the copy. I've seen a few posts/blogs that suggest comparing the bytes copied with the size of the source file, but that seems like a huge overhead in this case.
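File.Copy exposes no progress callback, but the copy can be done through two streams in chunks so the BackgroundWorker can report after each one; for a sequential copy the extra overhead over File.Copy is small. A minimal sketch (paths are hypothetical, and WorkerReportsProgress must be True on the worker):

Imports System.ComponentModel
Imports System.IO

Module CopyWithProgress
    Sub CopyFile(ByVal source As String, ByVal dest As String, ByVal worker As BackgroundWorker)
        ' Copy in 1 MB chunks and report the percentage done after each chunk;
        ' call this from the worker's DoWork handler.
        Dim buffer(1048575) As Byte
        Using input As New FileStream(source, FileMode.Open, FileAccess.Read)
            Using output As New FileStream(dest, FileMode.Create, FileAccess.Write)
                Dim total As Long = input.Length
                Dim copied As Long = 0
                Dim read As Integer = input.Read(buffer, 0, buffer.Length)
                While read > 0
                    output.Write(buffer, 0, read)
                    copied += read
                    worker.ReportProgress(CInt(copied * 100L \ total))
                    read = input.Read(buffer, 0, buffer.Length)
                End While
            End Using
        End Using
    End Sub
End Module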
View 2 Replies
Oct 21, 2011
I'm working through an example in a programming book and have stumbled across a problem that I can't seem to figure out. The program in question is a very basic 'level viewer' for a tile-based bitmap game (it's a game creation book). The source code from the author was written for Visual Studio 2005, and I am using 2010. The source code imports fine and the program runs correctly.
I built the same program on my own, starting from scratch in VS2010. The program works like this: it opens a tile set palette (bitmap form), then uses an XML file to create a 'level'. The XML file indicates which tile goes where on the 'level'. The XML file is roughly 1.9 MB and has roughly 16,000 entries (the program creates a level that is 4096 x 4096 pixels using 32 x 32 pixel tiles).
Running and closing the program from the source code works exactly as expected. Running the program from my version works exactly as expected. Closing it does not: it basically hangs for about 2 minutes on close (for reference, everything loads in about 3 seconds). I've determined that it is the XML file that is causing the problem by excluding the portion where it loads the XML; in that case it closes fine. If I cut it down to just a few records, my program closes fine as well. So it seems like the program is having trouble dumping the XML document at close.
I've tried to track down any differences between the source code and my code, and the only thing I have found is that the references in mine are all using .NET 4.0 while the references in the source are mostly 2.0 and some 3.5 (System.Data.DataSetExtensions, System.Xml.Linq.dll, and System.Core.dll).
The code for the entire form is:
Imports System.Xml
Public Class Form1
Public Structure tilemapStruct
[Code].....
View 5 Replies
Aug 16, 2009
[Code] Each file is about 30 kilobytes in size, and the files contain raw hardware statistics in a comma-delimited format. I want to do the following with these files as my end result.
NEWFILE.CVS contains the contents in this format:
metric_group_001 metric_group_002 metric_group_003 metric_group_004 etc.
I don't want APPEND, I want to concatenate the contents of these files into one large master file. I am able to do this MANUALLY, but I need a DYNAMIC method of doing this because the number of files will change depending on the test we are doing with the hardware. This is the code I am using to do this process manually, and it does work, but I need a DYNAMIC method in place. [Code]
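Enumerating the directory at run time removes the need to list the files by hand, so the same code works however many files a test produces. A minimal sketch (the input folder and the *.csv mask are assumptions; the output name follows the question):

Imports System.IO

Module Concatenate
    Sub Main()
        ' Concatenate every matching file in the folder into one master file.
        Dim inputFolder As String = "C:\stats"
        Dim files() As String = Directory.GetFiles(inputFolder, "*.csv")
        Array.Sort(files)                                       ' deterministic order
        Using output As New StreamWriter("NEWFILE.CVS", False)  ' False = create fresh, not append
            For Each inputPath As String In files
                output.Write(File.ReadAllText(inputPath))
            Next
        End Using
    End Sub
End Module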
View 1 Replies
Aug 30, 2010
I'm trying to browse for a large file and copy it with a BackgroundWorker, but I get the following error: "Current thread must be set to single thread apartment (STA) mode before OLE calls can be made. Ensure that your Main function has STAThreadAttribute marked on it. This exception is only raised if a debugger is attached to the process."
Code:
Private Sub BW_CopyEXE_DoWork(ByVal sender As System.Object, ByVal e As System.ComponentModel.DoWorkEventArgs) Handles BW_CopyEXE.DoWork
Dim fullFilename As String
[code]....
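The error comes from showing a file dialog (or making another OLE call) on the worker thread, which is not STA. The usual fix is to pick the file on the UI thread and only hand the resulting path to the worker. A minimal sketch (BW_CopyEXE as in the question; the class name, destination path, and the caller of StartCopy are hypothetical):

Imports System.ComponentModel
Imports System.IO
Imports System.Windows.Forms

Public Class CopyForm
    Private WithEvents BW_CopyEXE As BackgroundWorker

    Public Sub New()
        BW_CopyEXE = New BackgroundWorker()
    End Sub

    ' Call this from a button click on the UI (STA) thread.
    Private Sub StartCopy()
        Dim dlg As New OpenFileDialog()
        If dlg.ShowDialog() = DialogResult.OK Then
            BW_CopyEXE.RunWorkerAsync(dlg.FileName)      ' pass only the path to the worker
        End If
    End Sub

    Private Sub BW_CopyEXE_DoWork(ByVal sender As Object, ByVal e As DoWorkEventArgs) Handles BW_CopyEXE.DoWork
        ' The worker only receives a string and does the copy; no dialogs here.
        Dim fullFilename As String = CStr(e.Argument)
        Dim dest As String = Path.Combine("C:\backup", Path.GetFileName(fullFilename)) ' assumed destination
        File.Copy(fullFilename, dest, True)
    End Sub
End Class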
View 10 Replies
Aug 28, 2009
I am using zipforge.net to archive files in a small backup program I am creating. Everything works fine; however, if I try to archive a large number of files, the zipping process gets slower the further along in the job it gets.
Example: I have a directory that contains a little over 100,000 files. Each file is only about 200 KB of text. When I use the ZipForge zip class it starts zipping very quickly, but as it gets about 40% of the way through the directory it starts to slow down... at about 60% it's working at a snail's pace.
Memory usage for my solution also goes up in proportion to the archive file size as it grows. Currently it takes around 26 hours to back up this directory, which only contains 4 GB worth of information.
If I use a program like winzip I can zip the folder in less than an hour. I am thinking there has to be a way to write to the end of the zip file without having to open it every time or without keeping it open. My backup software scans the specified folder and writes each file path and name into a text file. Then it reads the text file line by line and passes the path + file info to the zipforge addfile().
View 1 Replies
Dec 10, 2011
I am faced with a rather large text file (200-400 lines). The file contains a lot of data; however, the problem is that it is not lined up. At the moment the data resembles this:
Column1 Column2 Column3 Column4
Bobby Fisher Virginia Rural
Willis Johnson Oklahoma City
[code].....
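One way to line the data up, assuming the fields are separated by runs of whitespace, is to split each record and re-write it with padded fixed-width columns. A minimal sketch (file names and the 15-character column width are assumptions):

Imports System.IO
Imports System.Text
Imports System.Text.RegularExpressions

Module AlignColumns
    Sub Main()
        ' Pad every field to a fixed width so the columns line up in a
        ' fixed-width font.
        Using writer As New StreamWriter("aligned.txt")
            For Each line As String In File.ReadAllLines("data.txt")
                Dim fields() As String = Regex.Split(line.Trim(), "\s+")
                Dim padded As New StringBuilder()
                For Each field As String In fields
                    padded.Append(field.PadRight(15))
                Next
                writer.WriteLine(padded.ToString().TrimEnd())
            Next
        End Using
    End Sub
End Module

If fields can themselves contain spaces (as "Oklahoma City" suggests), a whitespace split is not enough and the file's real delimiters or fixed column positions have to be used instead.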
View 1 Replies
May 9, 2011
I am trying to upload 2 large files to an FTP server, uploading them as one file to a single location. I have it working if the files are small, but it crashes with large files.
For i As Integer = 0 To filelist.Count() - 1
Dim fRequest As FtpWebRequest = WebRequest.Create(ftpPath & "/Reports/" & filelist.Item(i))
fRequest.Credentials = New NetworkCredential(username, psswd)
[code]....
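Crashes on large uploads usually come from reading the whole file into a byte array before sending; streaming through GetRequestStream keeps memory flat. A minimal sketch reusing the names from the snippet above (the local path handling and chunk size are assumptions):

Imports System.IO
Imports System.Net

Module FtpUpload
    Sub Upload(ByVal localPath As String, ByVal remoteUri As String, ByVal username As String, ByVal psswd As String)
        ' Upload one file in 32 KB chunks straight from disk to the FTP request stream.
        Dim fRequest As FtpWebRequest = CType(WebRequest.Create(remoteUri), FtpWebRequest)
        fRequest.Method = WebRequestMethods.Ftp.UploadFile
        fRequest.Credentials = New NetworkCredential(username, psswd)
        fRequest.UseBinary = True
        Using input As New FileStream(localPath, FileMode.Open, FileAccess.Read)
            Using output As Stream = fRequest.GetRequestStream()
                Dim buffer(32767) As Byte
                Dim read As Integer = input.Read(buffer, 0, buffer.Length)
                While read > 0
                    output.Write(buffer, 0, read)
                    read = input.Read(buffer, 0, buffer.Length)
                End While
            End Using
        End Using
        Using response As FtpWebResponse = CType(fRequest.GetResponse(), FtpWebResponse)
            Console.WriteLine(response.StatusDescription)
        End Using
    End Sub
End Module

One way to combine the two files "as one" is to call Upload for the first file and then send the second with a request whose Method is WebRequestMethods.Ftp.AppendFile against the same remote name.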
View 1 Replies
May 3, 2010
I'm trying to load folder and file icons, but the icons are too small to display on large surfaces.
Here is my current (working) code, but the icon it returns is too small:
[Code]...
View 1 Replies
Oct 8, 2011
Imagine there is a very large HTML file with, of course, lots of HTML tags. I cannot load the entire file into memory. My intention is to extract all the indexes of the <p> and </p> strings. How should I achieve this?
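Since the tags are only a few characters long, the file can be scanned character by character with a tiny sliding window, recording an index whenever the window ends in <p> or </p>; nothing but the window is held in memory. A minimal sketch (file name hypothetical; the recorded positions are character indexes, not byte offsets):

Imports System.Collections.Generic
Imports System.IO

Module TagIndexer
    Sub Main()
        ' Stream the file one character at a time and note the index of every
        ' "<p>" and "</p>".
        Dim openIdx As New List(Of Long)()
        Dim closeIdx As New List(Of Long)()
        Dim window As String = ""            ' last few characters seen
        Dim pos As Long = 0                  ' index of the current character
        Using reader As New StreamReader("big.html")
            Dim c As Integer = reader.Read()
            While c <> -1
                window &= ChrW(c)
                If window.Length > 4 Then window = window.Substring(1)
                If window.EndsWith("<p>") Then openIdx.Add(pos - 2)    ' index of the '<'
                If window.EndsWith("</p>") Then closeIdx.Add(pos - 3)
                pos += 1
                c = reader.Read()
            End While
        End Using
        Console.WriteLine("Found {0} <p> and {1} </p> tags", openIdx.Count, closeIdx.Count)
    End Sub
End Module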
View 5 Replies
Apr 3, 2010
How do I use a BackgroundWorker while opening a large text file?
View 3 Replies
Jun 26, 2009
Is there an easy way to insert a line at the very beginning of a large file? I want to insert an XML header line into the XML files that my program processes (these files can be very large, so reading one in via a bitstream and writing it back out is not a good option).
File.AppendText makes it very easy to append lines to the end of a file, but I don't see anything for inserting text. I wonder how, say, Access works when processing a 50 MB Access database file? There must be a similar process where you can insert data anywhere inside a large file.
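A file can't have bytes inserted at its start in place, so some rewrite is unavoidable, but it can be a plain stream copy: write the header, then copy the old contents through a small buffer, so memory use doesn't depend on the file size. A minimal sketch (file name and header text are hypothetical):

Imports System.IO

Module PrependLine
    Sub Main()
        ' Write the new first line to a temp file, stream the original after
        ' it, then swap the files.
        Dim source As String = "data.xml"
        Dim temp As String = source & ".tmp"
        Using writer As New StreamWriter(temp)
            writer.WriteLine("<?xml version=""1.0"" encoding=""utf-8""?>")
            Using reader As New StreamReader(source)
                Dim buffer(65535) As Char
                Dim read As Integer = reader.Read(buffer, 0, buffer.Length)
                While read > 0
                    writer.Write(buffer, 0, read)
                    read = reader.Read(buffer, 0, buffer.Length)
                End While
            End Using
        End Using
        File.Delete(source)
        File.Move(temp, source)
    End Sub
End Module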
View 1 Replies