I am working on a desktop application in VB.NET 2005. The application contains a timer with an interval of 1 minute. Each time the timer ticks, a set of functions gets executed, mostly database related. Initially the application runs fine: in Task Manager the CPU usage goes to 100% every time the timer fires, but only for about a second, which is negligible. However, after around 20 hours the time spent in timer_tick grows to 20-30 seconds; during that period CPU usage is at 100% and the application does not respond. Gradually the time spent in timer_tick increases to a full minute, the CPU usage gets stuck at 100%, and the application stops responding. All objects are properly disposed. Moreover, this issue only occurs on Pentium 4 processors; the application runs fine on a Core 2 Duo. I am using DevEx controls in my application. The program runs fine with fewer records in the database. I have run the CLR Profiler and the code seems to be fine.
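For reference, a minimal sketch of the per-tick disposal pattern described above, assuming SqlClient (the connection string and query are invented):

Imports System.Data.SqlClient

Private Sub Timer1_Tick(ByVal sender As Object, ByVal e As EventArgs) _
        Handles Timer1.Tick
    ' Using blocks guarantee Dispose() even if a query throws,
    ' so connections and commands cannot pile up between ticks.
    Using conn As New SqlConnection(connectionString)
        conn.Open()
        Using cmd As New SqlCommand("SELECT COUNT(*) FROM Orders", conn)
            Dim total As Integer = CInt(cmd.ExecuteScalar())
        End Using
    End Using
End Sub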
Microsoft keeps trying to assure us that it loves VB and VB programmers and wants them, but in reality the opposite is happening. Code samples, whether from Microsoft or from third-party developers, almost all portray C# code. For the same reason, C# programmers are challenging VB programmers in every field.
In my VB application I am copying a huge amount of data using VB strings, and this results in a performance issue. What should I use in place of VB strings to improve performance?
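Assuming the cost comes from repeated concatenation, where each & allocates a brand-new string, the usual replacement is System.Text.StringBuilder; a minimal sketch (chunks stands in for whatever data is being copied):

Imports System.Text
Imports System.Collections.Generic

Dim chunks As New List(Of String)  ' stand-in for the data being copied
Dim sb As New StringBuilder()
For Each chunk As String In chunks
    ' Append reuses StringBuilder's internal buffer; each "&" on a
    ' plain String would instead allocate and copy a whole new string.
    sb.Append(chunk)
Next
Dim result As String = sb.ToString()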
I have a single-threaded Windows Forms application written in VB.NET and targeting Framework 1.1. The software communicates with external boards through a serial interface, and it mainly consists of a state machine that runs some tests, driven in a loop by a Timer with an Interval of 50 ms. Feedback on the user interface is provided through custom events raised during the tests.
The problem that is driving me crazy is that performance slightly decreases over time, in particular after 1200-1300 test operations. The memory used does not increase over time; only the CPU seems affected by this problem. The strange thing is that, targeting Framework 2.0 and using the same identical code, I do not have this problem.
I know that it is difficult without looking at the code, but do you have suggestions for how I can approach the problem? EDIT: I am really lost; after a stretch of intensive work the application starts slowing down. The selected row is related to its process, if that helps.
EDIT 2: Using the Windows Task Manager I noticed that the Handles counter increases by 1 at the end of each operation. I don't know if it is the cause, but the application starts to slow down when the handle counter reaches about 1500 handles. I checked that all necessary RemoveHandler calls are made after each operation. Any idea?
EDIT 3: I found that the handle problem is generated by the C++ library we are using to communicate with the serial device. It happens both on .NET 1.1 and .NET 2.0. The difference, and it's strange, is that if I target .NET 1.1 the application slows down and freezes, whereas on .NET 2.0 I reached more than 30,000 handles without losing performance. I don't know yet if the problem is really caused by these leaked handles; I will ask the developers of the C++ library to correct it and see if that solves the problem I am having on .NET 1.1.
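One way to confirm the leak without staring at Task Manager (a sketch; Process.HandleCount needs .NET 2.0, so it only works on that target) is to log the handle count after each operation:

Imports System.Diagnostics

Private Sub LogHandleCount(ByVal operationNumber As Integer)
    ' Refresh() forces the cached process info to be re-read;
    ' otherwise HandleCount can return a stale value.
    Dim current As Process = Process.GetCurrentProcess()
    current.Refresh()
    Debug.WriteLine(String.Format("Operation {0}: {1} handles", _
        operationNumber, current.HandleCount))
End Sub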
I'm using Visual Studio 2008 and Microsoft SAPI in a Windows Forms application. Text-to-speech works just fine; however, while the computer is speaking it "bricks" the application, and the main UI is unresponsive until the computer is done talking, then everything returns to normal. To counter this issue I tried using a BackgroundWorker thread to run SAPI on another thread, but the form still freezes when the computer speaks. I would like to note that at this point there is nothing on the form except a tabbed window and a few buttons; the form is not executing any code whatsoever other than the text-to-speech.
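Assuming the SpeechLib COM interop is what's being used, SAPI can speak asynchronously by itself via the SVSFlagsAsync flag, so no BackgroundWorker is needed; a minimal sketch:

' Requires a COM reference to the Microsoft Speech Object Library (SpeechLib).
Dim voice As New SpeechLib.SpVoice()
' SVSFlagsAsync makes Speak return immediately; SAPI talks on its
' own background thread and the UI message loop keeps running.
voice.Speak("Hello world", SpeechLib.SpeechVoiceSpeakFlags.SVSFlagsAsync)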
Until now, I have not created any massive applications using ASP.NET. However, I am looking to create an application that has the potential to be very performance intensive. So I am looking for some tools or best practices when it comes to performance. I would like to be able to:
- See my current performance (good or bad)
- View items that need fixing
Being able to compare two performance metrics would be great as well.
Currently I'm writing a VB.NET app and it's getting big, which has resulted in it becoming very slow. Is there any application (or plug-in) that can test the performance in seconds? I mean, when I click a button and it displays a product, I want to know exactly how long that is going to take.
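Short of a full profiler, System.Diagnostics.Stopwatch gives exactly that number; a sketch assuming the work happens in a button's Click handler (the button name and DisplayProduct call are invented):

Imports System.Diagnostics

Private Sub ShowProductButton_Click(ByVal sender As Object, ByVal e As EventArgs) _
        Handles ShowProductButton.Click
    Dim sw As Stopwatch = Stopwatch.StartNew()
    DisplayProduct()   ' hypothetical stand-in for the slow operation
    sw.Stop()
    MessageBox.Show("Took " & sw.ElapsedMilliseconds & " ms")
End Sub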
My friend is currently developing an application using an SOA architecture. He sent me a picture with a lot of layers (almost 10), and he is worried about performance issues. The application is developed in VB.NET 2.0 and 3.5 (some libs).
I'm trying to weave a .NET performance counter into an application. When I call the IncrementBy(value) method on my average performance counter, it changes the RawValue of my base counter by value as well. I checked the variable names and think everything is correct. Then when I call Increment() on my base counter, it adds 1 to the RawValue of the average counter as well as incrementing the base counter... adding insult to injury!
In code I'm using two different counters to measure the time a merge sort I wrote takes. I have an instantaneous counter for the elapsed time of the sort and an average counter.

Dim timePiece As New Stopwatch()
timePiece.Start()
MergeSort()
timePiece.Stop()
[Code] .....
I think I may be using the counters wrong, but why does changing the RawValue of the average counter seem to also change the RawValue of the base? I don't think that's supposed to happen.
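For reference, the documented pairing is an AverageTimer32 immediately followed by its AverageBase in the same category, with the timer fed elapsed ticks and the base incremented once per operation; a sketch with invented category/counter names, reusing the Stopwatch from above:

Imports System.Diagnostics

' One-time setup: the AverageBase must be added directly after
' the AverageTimer32 it supports.
If Not PerformanceCounterCategory.Exists("MergeSortDemo") Then
    Dim counters As New CounterCreationDataCollection()
    counters.Add(New CounterCreationData("SortTime", "", _
        PerformanceCounterType.AverageTimer32))
    counters.Add(New CounterCreationData("SortTimeBase", "", _
        PerformanceCounterType.AverageBase))
    PerformanceCounterCategory.Create("MergeSortDemo", "", _
        PerformanceCounterCategoryType.SingleInstance, counters)
End If

Dim avgCounter As New PerformanceCounter("MergeSortDemo", "SortTime", False)
Dim baseCounter As New PerformanceCounter("MergeSortDemo", "SortTimeBase", False)

Dim timePiece As Stopwatch = Stopwatch.StartNew()
MergeSort()
timePiece.Stop()

avgCounter.IncrementBy(timePiece.ElapsedTicks)   ' numerator: total ticks
baseCounter.Increment()                          ' denominator: one operation

With distinct counter objects created this way, incrementing one should not move the other's RawValue.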
I've developed a VB.NET application with Visual Studio 2008. The application communicates with SQL Server and processes a text file.
My question is about performance. When I run it from Visual Studio 2008, it takes 3 seconds to complete. The same is true when I run the executable created by the Setup Wizard on my desktop (Windows XP SP3). But if I run the executable installed on a Windows 2003 Server, it takes 15 seconds to complete! What could be the reason for the degraded performance on the server vs. the desktop? The .NET Framework 3.5 SP1 is installed on both the desktop and the server.
Stupid question, but how do I get access to this Explorer? The documentation says: To create a performance session for a Windows client application:
(1) Open the solution in the Visual Studio IDE.
(2) On the Analyze menu, click Launch Performance Wizard.
(3) From the "Which of the following available targets would you like to profile?" drop-down list, select the name of the application that you want to profile, and then click Next. You can add more binaries later.
(4) Accept the default Sampling profiling method, and then click Next.
(5) Click Finish.
Only problem is that there is no Analyze menu in the IDE.
I am running Windows Vista with 1 GB of memory and a 2 GHz processor. What would be the best FREE version of .NET to use for best performance? It's been years since I have done any PC programming and I would like to do a little bit =]
The upper picture is the output of a Unix task; the lower picture is what I have in VB. Enumerating tasks is not difficult; the rest is, because it's not documented anywhere.
What should I be looking at if I want to get 64-bit performance data?
I'm working on a project that calls for high-performance networking with TLS encryption in VB.NET. The stock SslStream sucks rotten eggs: its "asynchronous" read/write operations aren't async at all. They operate in parallel with each other and with the main thread, but the class offers absolutely nothing that resembles async read/read and write/write, which means that reads are totally synchronous with each other and writes are too (only 1 read and only 1 write can be in flight at a time).
I've done a lot of reading on this failing, and it seems like MS has no intention of ever fixing it. I'm hoping very hard not to have to write my own TLS implementation. I've tried the Mentalis open source SSL suite, which will NOT compile. I've tried the SocketWrench tools, which will not compile under VS 2010 (and whose documentation indicates it's in fact synchronous on its async threads too; it can only do 1 async read and 1 async write at a time).
What I want/need is an SSL/TLS implementation that actually works with .NET and actually allows me to do multiple async reads/writes over a TCP/IP transport stream. Does anybody have any experience or recommendations on 3rd-party toolkits or alternate methods of SslStream I/O?
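For context, this is the one-outstanding-read pattern the stock SslStream forces, sketched with the standard BeginRead/EndRead chaining (the stream is assumed to be already authenticated):

Imports System.Net.Security

Private ReadOnly buffer(8191) As Byte
Private sslStream As SslStream   ' assumed already authenticated

Private Sub StartRead()
    ' Only one read may be outstanding; the next BeginRead is
    ' issued from the callback, never in parallel.
    sslStream.BeginRead(buffer, 0, buffer.Length, AddressOf OnRead, Nothing)
End Sub

Private Sub OnRead(ByVal ar As IAsyncResult)
    Dim bytesRead As Integer = sslStream.EndRead(ar)
    If bytesRead > 0 Then
        ' ... process buffer(0 To bytesRead - 1) ...
        StartRead()
    End If
End Sub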
I've developed a .NET application that, among other things, does the following: it uses WebClient to retrieve data from a remote server, and it serves as a socket server for 2 "satellite" applications run on the same machine or on a LAN. When I run the app in the VS IDE, it works great: it quickly gets the data from the remote server and communicates perfectly with the 2 satellites. However, when I build it and run it as an EXE, the response from the remote server is very slow and its communication with the 2 satellite applications becomes very poor. Is there some important difference between running an app in the IDE and running it as an EXE that could affect it like this?
I'm overloading a VB.NET search procedure which queries a SQL database. One of the older methods I'm using as a comparison uses a stored procedure to perform the search and return the query; my new method uses LINQ. I'm slightly concerned about the performance when using Contains queries with LINQ. I'm looking at equally comparable queries using both methods, basically having 1 where clause. Here are some profiler results:
Where name = "ber10rrt1"
    LINQ query: 24 reads
    Stored query: 111 reads
[code]....
Forgetting for a moment the indexes (not my database)... the fact of the matter is that both methods are fundamentally performing the same query (albeit the stored procedure does reference a view for some of the tables). Is this consistent with other people's experience of LINQ to SQL? Also, interestingly enough, using Like "BER10%"...
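For the prefix case, LINQ to SQL translates String.StartsWith into that same LIKE 'BER10%' predicate, which makes the comparison like-for-like; a sketch (the db context, its Designs table, and the Name column are invented):

' db is a LINQ to SQL DataContext; Designs is a mapped table.
Dim results = From d In db.Designs _
              Where d.Name.StartsWith("BER10") _
              Select d
' Generated SQL: ... WHERE [Name] LIKE 'BER10%'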
I have this code to ping a list of over 400 switches. The program works fine, but for some reason the first time I ping all the switches a lot of the attempts time out (even though they shouldn't), which makes the program a lot slower. If I click ping again, fewer of the attempts time out (and as a result the program runs faster), and if I click it a third time I usually get a completely accurate result with all the switches pinging back "success" (except for a few which I know for a fact are down anyway). Any ideas why this is?
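One guess is that the first sweep pays for cold ARP/DNS lookups on every switch; a sketch that gives each attempt a generous timeout using System.Net.NetworkInformation.Ping (the 2000 ms figure is arbitrary):

Imports System.Net.NetworkInformation

Private Function IsSwitchUp(ByVal host As String) As Boolean
    Using pinger As New Ping()
        ' A generous timeout gives slow first-time ARP/DNS resolution
        ' room to finish before the attempt is counted as failed.
        Dim reply As PingReply = pinger.Send(host, 2000)
        Return reply.Status = IPStatus.Success
    End Using
End Function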
When writing an If statement, I've always used And when needed, like: If 1=1 And 2=2 Then. The only time I ever used AndAlso is if the second condition would error when the first isn't true, like: If Not IsDbNull(Value) AndAlso Value=2 Then. However, recently I've heard that AndAlso is better for performance than And, as the second condition is only evaluated when the first is true. In this case, should I always just use AndAlso?
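A tiny demonstration of the difference (ExpensiveCheck is an invented function with a visible side effect):

Private Function ExpensiveCheck() As Boolean
    Console.WriteLine("ExpensiveCheck ran")
    Return True
End Function

Private Sub Demo()
    Dim result As Boolean
    result = False And ExpensiveCheck()       ' prints "ExpensiveCheck ran"
    result = False AndAlso ExpensiveCheck()   ' prints nothing: short-circuited
End Sub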
These 2 ways of working both work, but I'm wondering if there's a difference in performance:
Dim collection As ItemCollection = CType(CellCollection.Where(Function(i) i.IsPending = True), ItemCollection)
For Each item As Item In collection
    'Do something here
Next
and
For Each item As Item In CellCollection.Where(Function(i) i.IsPending = True)
    'Do something here
Next
I thought the second one was better as you'd have one variable fewer and it looks cleaner, but on second thought, I'm not quite sure what happens when you put a LINQ query in the iteration.
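If it helps, with LINQ to Objects the Where call returns a lazy sequence either way: the For Each drives a single pass over CellCollection, so neither version enumerates twice. A sketch that makes the single evaluation explicit with ToList():

' ToList() runs the Where filter exactly once and snapshots the
' matches, so later enumerations don't re-walk CellCollection.
Dim pending As List(Of Item) = _
    CellCollection.Where(Function(i) i.IsPending).ToList()
For Each item As Item In pending
    'Do something here
Next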
I'm sorting a collection of type "Document" (usually around 100k records). Sorting usually takes around 4-5 seconds, and I'm wondering if there's a way to speed it up by modifying my "DocumentComparer" class, which implements IComparer(Of Document). Since the Compare() method is called hundreds of thousands of times, are there any performance improvements that could be made here that I've overlooked?
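Without the real comparer this is only a guess, but a frequent win in hot Compare() paths is swapping culture-aware string comparison for an ordinal one; a sketch assuming Document is keyed by a Name string:

Public Class DocumentComparer
    Implements IComparer(Of Document)

    Public Function Compare(ByVal x As Document, ByVal y As Document) As Integer _
            Implements IComparer(Of Document).Compare
        ' CompareOrdinal skips the culture tables, which is markedly
        ' cheaper than String.Compare's culture-aware default.
        Return String.CompareOrdinal(x.Name, y.Name)
    End Function
End Class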
I have an unusual problem with mailing from my app. At first it wasn't working (I was getting "unable to relay" errors); anyway, I added the proper authentication and it works. My problem now is, if I try to send around 300 emails (each with a 500k attachment), the app starts hanging around 95% of the way through the process.
Here is some of my code, which is called for each mail to be sent.
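Without seeing the code, a common cause of hangs late in a large batch with attachments is MailMessage objects that are never disposed, which keeps each attachment's file handle open; a sketch of a per-mail pattern that avoids it (server name and credentials are placeholders):

Imports System.Net
Imports System.Net.Mail

Private Sub SendOne(ByVal toAddress As String, ByVal filePath As String)
    Dim smtp As New SmtpClient("mail.example.com")
    smtp.Credentials = New NetworkCredential("user", "password")
    ' Disposing the message releases the attachment's file handle;
    ' ~300 undisposed 500k attachments can exhaust resources.
    Using msg As New MailMessage("app@example.com", toAddress)
        msg.Subject = "Report"
        msg.Attachments.Add(New Attachment(filePath))
        smtp.Send(msg)
    End Using
End Sub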
I have a mission to make a small game for a school project: picture boxes, moved by a timer, for walking enemies. If there are around 5 or 6 moving picture boxes on the form, my application gets into trouble and lags. After I kill some enemies (remove them from the Controls collection of the Form/Panel) it becomes smooth again. I think the loop for the enemy movement is too complicated, but I don't know how to make it simpler.
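Not knowing the actual movement loop, one sketch (MoveTimer, EnemyPanel, and the enemies list are invented names) is to batch all position changes per tick between SuspendLayout and ResumeLayout, so the container lays out once per tick instead of once per PictureBox:

Private ReadOnly enemies As New List(Of PictureBox)

Private Sub MoveTimer_Tick(ByVal sender As Object, ByVal e As EventArgs) _
        Handles MoveTimer.Tick
    ' Batch all moves so the panel performs a single layout pass.
    EnemyPanel.SuspendLayout()
    For Each enemy As PictureBox In enemies
        enemy.Left -= 4   ' walk left 4 pixels per tick
    Next
    EnemyPanel.ResumeLayout()
End Sub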
I'm creating a password hash recovery tool that uses brute force. The brute force algorithm runs in a BackgroundWorker. It all works, but some similar apps I tried out have performance ~100x better than my app (mine does 50k keys/second; some others do 5M keys/second or more). I realize this is partly caused by using a .NET language, but I suspect there are ways to significantly speed up the brute force. A lot of people say I should use multithreading, but how is this done in practice? Splitting up the passwords to check would slow down my app, I think.
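On the multithreading point, the usual trick is to split the keyspace into disjoint slices up front, so each thread runs lock-free with no coordination cost; a sketch, where TryRange stands in for the existing per-candidate loop:

Imports System.Threading

Public Class RangeWorker
    Public StartIndex As Integer
    Public EndIndex As Integer
    Public Sub Run()
        ' Each worker brute-forces only candidates whose first character
        ' falls in [StartIndex, EndIndex]; slices never overlap, so the
        ' hot loop needs no locks.
        TryRange(StartIndex, EndIndex)   ' TryRange is hypothetical
    End Sub
End Class

' Launch one worker per CPU over disjoint slices of a 26-letter alphabet.
Dim threads As New List(Of Thread)
Dim cpus As Integer = Environment.ProcessorCount
For i As Integer = 0 To cpus - 1
    Dim w As New RangeWorker()
    w.StartIndex = i * 26 \ cpus
    w.EndIndex = (i + 1) * 26 \ cpus - 1
    Dim t As New Thread(AddressOf w.Run)
    t.Start()
    threads.Add(t)
Next
For Each t As Thread In threads
    t.Join()   ' wait for every slice to finish
Next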
I have a few Oracle procedures that generate/return a large amount of data that I need to write out to a file. I'm currently trying to accomplish this with a DataReader. It seems to be working; I've successfully generated a 479 MB file without any trouble. It took less than 4 minutes from the time I retrieved the DataReader to complete the file. But the DataReader I get for one particular procedure is crawling. It's unbelievably slow. I modified my code to try to get a better idea of what is going on. [code] I'm writing a PL/SQL block that will open that cursor, to see if the performance issue still exists when I remove the .NET code.
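If the provider is ODP.NET (Oracle.DataAccess.Client), one knob worth checking before blaming the cursor is the reader's FetchSize, since a small per-round-trip buffer can make one reader crawl while others fly; a sketch (procedure and parameter names are invented):

Imports System.Data
Imports Oracle.DataAccess.Client

Using cmd As New OracleCommand("pkg.get_big_cursor", conn)
    cmd.CommandType = CommandType.StoredProcedure
    Dim rc As New OracleParameter("rc", OracleDbType.RefCursor)
    rc.Direction = ParameterDirection.Output
    cmd.Parameters.Add(rc)
    Using reader As OracleDataReader = cmd.ExecuteReader()
        ' Fetch roughly 1 MB of rows per round trip instead of the
        ' provider's modest default buffer.
        reader.FetchSize = 1024 * 1024
        While reader.Read()
            ' ... write the row out to the file ...
        End While
    End Using
End Using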
I am dealing with a situation that I need some help with. I need to improve the performance of functionality that records a user's selection and updates the UI with it. What my code currently does is:
'This is called to update the database each time the user
'makes a new selection on the UI
Private Sub OnFilterChanged(ByVal newReviewValueToAdd As String)
    AddRecentViewToDB(newReviewValueToAdd)
    UpdateRecentViewsUI()
    PageReviewGrid.Rebind()   'Call grid rebind
End Sub
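If this is a WinForms app (the snippet doesn't say), one sketch is to push the database write onto a BackgroundWorker and rebind only when it completes; the method names are the ones from the snippet above:

Imports System.ComponentModel

Private WithEvents FilterWorker As New BackgroundWorker()

Private Sub OnFilterChanged(ByVal newReviewValueToAdd As String)
    If Not FilterWorker.IsBusy Then
        FilterWorker.RunWorkerAsync(newReviewValueToAdd)   ' DB write off the UI thread
    End If
End Sub

Private Sub FilterWorker_DoWork(ByVal sender As Object, ByVal e As DoWorkEventArgs) _
        Handles FilterWorker.DoWork
    AddRecentViewToDB(CStr(e.Argument))
End Sub

Private Sub FilterWorker_RunWorkerCompleted(ByVal sender As Object, _
        ByVal e As RunWorkerCompletedEventArgs) Handles FilterWorker.RunWorkerCompleted
    ' Back on the UI thread: safe to touch controls.
    UpdateRecentViewsUI()
    PageReviewGrid.Rebind()
End Sub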
So, I've been reading around on the internet and through MSDN trying to find a way to add thousands of items to a ComboBox without it taking 12 seconds. I KNOW I'm going to incite reflex responses involving: bad design; use categories and filter; use BeginUpdate... Here's the thing: I've read through a lot of forum questions similar to the one I'm asking, and none really answered it, but those responses are common. The problem is, there is no way to further categorize. I have thousands of customers, and each customer can have anywhere from 1 to tens of thousands of designs. The designs have two IDs: DesignID and DesignTitle. The data entry people sometimes have the one and sometimes have the other. If they have the DesignID, then all they've got is a number ranging from 1 to (at this time) 60,000. One customer has 11,000 designs. I don't know of any way to further categorize. The reason I use a ComboBox is for autocomplete. Also, with most customers there are only 10-20 designs, but for some there are thousands. I have to have one practical solution that handles all possibilities.
As far as BeginUpdate or SuspendLayout is concerned, neither has any effect. The sad thing is, the data source takes less than a second to load from my database; it's just the setting of the DataSource for the ComboBox that takes so long. It seems to iterate through the entire list once for the BindingSource and then again when I set the DisplayMember. I can set the DisplayMember before I set the BindingSource, but that doesn't make any difference in terms of performance. Performance is improved by almost half when I don't use a BindingSource and just set the combo's DataSource to the list directly, but sadly, I need to use the BindingSource because there are multiple combos using the same list. I know WPF uses virtualization for its combos. On the internet I found a technical document on how to implement a virtual ComboBox in C#, but the code was incomplete and I don't really have the time to spend several days creating and debugging something like that.
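For comparison, BeginUpdate/EndUpdate only pays off when items go through the Items collection rather than a DataSource; a sketch of the unbound path (designs is a placeholder list, and this does give up the shared BindingSource):

ComboBox1.BeginUpdate()
Try
    ' AddRange with a prebuilt array avoids per-item repaint work;
    ' BeginUpdate is ignored entirely by DataSource binding.
    Dim titles(designs.Count - 1) As Object
    For i As Integer = 0 To designs.Count - 1
        titles(i) = designs(i).DesignTitle
    Next
    ComboBox1.Items.AddRange(titles)
Finally
    ComboBox1.EndUpdate()
End Try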
The following block of code always returns zero for the CPU usage. However, if I run it through Visual Studio in debugging mode, do a "Quick Watch" on the variable "pcCPUCounter", add ".NextValue()" at the top, and tell it to re-evaluate, it returns a varying percentage (e.g., 5%, 71%, 16%, etc.) as the processor utilization fluctuates.
Why would the code always print out zero, but the quick watch doesn't?
Private Sub UpdateCPUUsage(ByVal strSelectedServer As String)
    'Performance items come from Performance Monitor: perfmon.msc.
    'Then right-click the columns at the bottom and select "Add Counter" to see the list.
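For what it's worth, "% Processor Time" is a delta counter, so the first NextValue() call always returns 0; the Quick Watch "works" because it forces that extra first call. A sketch of the usual two-call workaround (counter setup assumed to match the post):

Imports System.Diagnostics
Imports System.Threading

Dim pcCPUCounter As New PerformanceCounter( _
    "Processor", "% Processor Time", "_Total", strSelectedServer)
' The first sample only establishes a baseline and reads as 0;
' wait at least one sampling interval, then read the real value.
pcCPUCounter.NextValue()
Thread.Sleep(1000)
Dim cpuPercent As Single = pcCPUCounter.NextValue()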