I have a dataset in a Visual Studio 2010 web app project which accesses the DB with a complex SQL statement. If I run the statement directly in SQL Management Studio, it loads in less than a second. If, however, I run it using the "Preview Data" button in the dataset designer, or I try to access it on a page (with a GridView, for example), it takes over 40 seconds!
I have a program where I need to loop through the entire DataSet.DataTable, and it takes a long time. What alternative can I use to do the same job faster?
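A minimal sketch of one common speed-up, assuming the slow loop looks up columns by name on every row; the table and column names below are placeholders:

Imports System.Data

Module FastLoopSketch
    ' Sums a numeric column without per-row column-name lookups.
    Function SumAmounts(ByVal table As DataTable) As Decimal
        Dim amountCol As DataColumn = table.Columns("Amount")  ' resolve the column once
        Dim total As Decimal = 0D
        For Each row As DataRow In table.Rows
            If Not row.IsNull(amountCol) Then
                total += CDec(row(amountCol))                  ' index by DataColumn, not by string
            End If
        Next
        Return total
    End Function
End Module

If the loop only filters rows, DataTable.Select or a LINQ-to-DataSet query can replace it, but the biggest cost in such loops is usually repeated string indexing and boxing.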
I'm currently working with a GridView that has a huge underlying dataset. I'm almost certain I saw something about a new facility in Visual Studio 2010/.NET 4 that would improve performance (e.g. it doesn't download all the data on databind, only the data required for the current page). My guess is that it used AJAX. I didn't need it at the time I saw it, and now I can't find it, even though I've been hunting online for a while.
Simple question that does not seem to be covered: if I use a lot of Debug.WriteLine statements in my code, will they be completely absent from my production version?
I mean: is the compiler smart enough not to emit any code for those calls? Or would I have to surround them with #If DEBUG...#End If directives?
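For reference, Debug.WriteLine is marked with <Conditional("DEBUG")>, so in a build where the DEBUG constant is not defined (a typical Release build) the compiler drops the call and the evaluation of its arguments entirely; an explicit #If block is only needed when you also want to exclude surrounding setup code. A small illustration:

Imports System.Diagnostics

Module LoggingSketch
    Sub Main()
        ' Removed entirely, argument included, when DEBUG is not defined.
        Debug.WriteLine("Only emitted when DEBUG is defined: " & ExpensiveDump())

        ' An explicit #If block achieves the same thing and also lets you exclude
        ' surrounding code, not just the call itself.
#If DEBUG Then
        Dim dump As String = ExpensiveDump()
        Debug.WriteLine(dump)
#End If
    End Sub

    Function ExpensiveDump() As String
        Return "state..."
    End Function
End Module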
It runs smoothly the first time, but when I execute it a second time I get the following error: Unable to copy file "obj\Debug\FYP.exe" to "bin\Debug\FYP.exe". The process cannot access the file 'bin\Debug\FYP.exe' because it is being used by another process.
When the user clicks an "Edit" button on my form, I want a dialog to come up that lets the user edit a DataTable in a strongly-typed DataSet. What's the best way to do this?
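Not the only way, but a minimal sketch of one common approach: copy the table, let the user edit the copy in a DataGridView on a modal form, and merge it back only on OK. The DataSet and control names below are placeholders:

Dim workingCopy As DataTable = myTypedDataSet.Customers.Copy()

Using dlg As New Form() With {.Text = "Edit Customers", .Width = 600, .Height = 400}
    Dim grid As New DataGridView() With {.Dock = DockStyle.Fill, .DataSource = workingCopy}
    Dim okButton As New Button() With {.Text = "OK", .Dock = DockStyle.Bottom, .DialogResult = DialogResult.OK}
    dlg.Controls.Add(grid)
    dlg.Controls.Add(okButton)

    If dlg.ShowDialog() = DialogResult.OK Then
        ' Fold the edits back into the typed table; rows are matched on the primary key.
        myTypedDataSet.Customers.Merge(workingCopy)
    End If
End Using

This covers cell edits; rows the user deletes in the copy would need extra handling, since Merge alone won't propagate the deletions.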
I have code running in the DataTable.ColumnChanging event in my dataset. This dataset underlies a form, and conventional drag/drop controls are in place for data entry. When the event triggers and runs, code in the form checks the dataset's HasChanges property, and it is showing False, even though this is immediately after the ColumnChanging event has fired. OK, I see from other posts and MSDN that HasChanges will only be True after moving off the row with the changed column. I have also noted lots of discussion about the advanced binding property DataSourceUpdateMode, but that does not address this issue. I guess I can do this by checking the state of the row through the binding source; it just seems odd that the event behind the dataset can be triggered without that changing the dataset's HasChanges property.
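Not from the original post, but a sketch of the usual workaround: push the pending edit through the BindingSource so the row edit ends and HasChanges is refreshed. The BindingSource and dataset names below are placeholders:

' Commit the in-progress row edit from the bound controls back to the DataTable.
CustomersBindingSource.EndEdit()

If MyDataSet.HasChanges() Then
    ' e.g. enable the Save button as soon as a field is committed
    btnSave.Enabled = True
End If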
I have tried everything I can to get beyond this error, which shows below as <<<<< error here. It is trying to fill a dataset from a data adapter. If I change the SELECT statement to just SELECT * FROM xTable I get the correct number of records in each table, but any time I try a more complex statement I get the error message shown below, which points at System.Data.Common.DbDataAdapter.Fill(DataSet dataSet). I've completely erased all data and entered a new set of test data, so I know there is no problem with relationships; each table has a primary key which is a foreign key in the other table. Is there something wrong with the Imports section: Imports System
I am trying to copy data from a standard DataSet to a typed DataSet (XSD) with the same table structure, and I want to use AutoMapper to do it. How can I do that using AutoMapper?
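AutoMapper is aimed at object-to-object mapping, so when the two DataSets share the same table structure a plain Merge (or a row-by-row ImportRow) may be simpler; a minimal sketch, with placeholder names:

' MyTypedDataSet is the generated typed DataSet; untypedDs is the plain DataSet.
Dim typedDs As New MyTypedDataSet()

' Option 1: merge the whole untyped DataSet; tables and columns are matched by name.
typedDs.Merge(untypedDs)

' Option 2: copy one table row by row, preserving each row's values and RowState.
For Each row As DataRow In untypedDs.Tables("Customers").Rows
    typedDs.Customers.ImportRow(row)
Next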
I have two DataGrids. One grid has all the customers' garments on it, with style number and contract length; the other grid has the users who have a garment issued to them. The style number is in both grids. I need to loop through the users grid and say: if the contract length for that style is 1 in the first grid, then the contract date on the second grid should be today's date + 365 days (a one-year contract). I have looked at using a stored procedure and also a For Each loop, but I am just getting stuck with it all. [Code]
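The original code isn't shown, but as a minimal sketch assuming both grids are bound to DataTables (the column names below are placeholders):

Dim garments As DataTable = CType(dgGarments.DataSource, DataTable)
Dim users As DataTable = CType(dgUsers.DataSource, DataTable)

For Each userRow As DataRow In users.Rows
    ' Find the garment row with the same style number as this issued garment.
    Dim filter As String = "StyleNumber = '" & userRow("StyleNumber").ToString() & "'"
    Dim matches() As DataRow = garments.Select(filter)

    If matches.Length > 0 AndAlso CInt(matches(0)("ContractLength")) = 1 Then
        ' A one-year contract: due date is today plus 365 days.
        userRow("ContractDate") = Date.Today.AddDays(365)
    End If
Next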
I have a problem saving a dataset which contains rows that I have imported from another dataset. I can successfully view the imported rows in a GridView, but I cannot commit the rows back to the database.
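One likely cause, offered as a guess since the code isn't shown: DataTable.ImportRow preserves the source row's RowState, so rows that were Unchanged in the source arrive Unchanged here and the adapter's Update call skips them. A minimal sketch of marking them as new (table and adapter names are placeholders):

For Each row As DataRow In targetDs.Tables("Orders").Rows
    If row.RowState = DataRowState.Unchanged Then
        row.SetAdded()   ' mark the row so Update issues an INSERT for it
    End If
Next

adapter.Update(targetDs.Tables("Orders"))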
I am running Windows Vista with 1 GB of memory and a 2 GHz processor. What would be the best free version of .NET to use for the best performance? It's been years since I have done any PC programming and I would like to do a little bit.
The upper picture is the output of a Unix task; the lower picture is what I have in VB. Enumerating tasks is not difficult, but the rest is, because it's not documented anywhere.
What am I interested in if I want to get 64-bit performance data?
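As an assumption about what "64-bit performance data" means here: the Process class exposes 64-bit counters (the *64 properties) for each running task, which pairs naturally with the task enumeration mentioned above. A minimal sketch:

Imports System.Diagnostics

Module ProcessInfoSketch
    Sub DumpProcesses()
        For Each p As Process In Process.GetProcesses()
            Try
                ' WorkingSet64, PrivateMemorySize64, etc. are the 64-bit counters that
                ' supersede the older 32-bit properties of the same name.
                Console.WriteLine("{0,-30} WS={1,12:N0}  CPU={2}",
                                  p.ProcessName, p.WorkingSet64, p.TotalProcessorTime)
            Catch ex As Exception
                ' Access to some system processes is denied; skip them.
            End Try
        Next
    End Sub
End Module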
I'm working on a project that calls for high-performance networking with TLS encryption in VB.NET. The stock SslStream sucks rotten eggs; its 'asynchronous' read/write operations aren't really async at all. They run parallel to each other and to the main thread, but the stream offers nothing that resembles concurrent read/read or write/write, which means reads are completely serialized with respect to each other and writes are too (only one read and only one write can be in flight at a time).
I've done a lot of reading on this shortcoming and it seems like MS has no intention of ever fixing it. I'm very much hoping not to have to write my own TLS implementation. I've tried the Mentalis open-source SSL suite, which will NOT compile. I've tried the SocketWrench tools, which will not compile under VS 2010 (and whose documentation indicates it is in fact synchronous on its async threads too; it can only do one async read and one async write at a time).
What I want/need is an SSL/TLS implementation that actually works with .NET and actually allows me to do multiple async reads/writes over a TCP/IP transport stream. Does anybody have experience with, or recommendations for, third-party toolkits or alternative approaches to SslStream I/O?
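No third-party recommendation here, but one workaround that avoids replacing SslStream altogether is to serialize writes through a queue, so any number of callers can submit data while only one BeginWrite is ever outstanding (a mirror-image reader can do the same for reads). A minimal sketch, assuming an already-authenticated SslStream and omitting error handling:

Imports System.Collections.Generic
Imports System.Net.Security

Public Class SerializedSslWriter
    Private ReadOnly _stream As SslStream
    Private ReadOnly _pending As New Queue(Of Byte())
    Private _writing As Boolean

    Public Sub New(ByVal stream As SslStream)
        _stream = stream
    End Sub

    Public Sub Send(ByVal buffer As Byte())
        SyncLock _pending
            _pending.Enqueue(buffer)
            If _writing Then Return   ' a write is already in flight; it will drain the queue
            _writing = True
        End SyncLock
        WriteNext()
    End Sub

    Private Sub WriteNext()
        Dim buffer As Byte()
        SyncLock _pending
            If _pending.Count = 0 Then
                _writing = False
                Return
            End If
            buffer = _pending.Dequeue()
        End SyncLock
        _stream.BeginWrite(buffer, 0, buffer.Length, AddressOf OnWritten, Nothing)
    End Sub

    Private Sub OnWritten(ByVal ar As IAsyncResult)
        _stream.EndWrite(ar)
        WriteNext()               ' chain the next queued buffer, if any
    End Sub
End Class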
I've developed a .NET application that, among other things, does the following: it uses WebClient to retrieve data from a remote server, and it serves as a socket server to two 'satellite' applications run on the same machine or on a LAN. When I run the app in the VS IDE, it works great: it quickly gets the data from the remote server and communicates perfectly with the two satellites. However, when I build it and run it as an EXE, the response from the remote server is very slow and its communication with the two satellite applications becomes very poor. Is there some important difference between running an app in the IDE and running it as an EXE that could affect it like this?
I'm overloading a VB.NET search procedure which queries a SQL database. One of the older methods I'm using as a comparison uses a stored procedure to perform the search and return the results; my new method uses LINQ. I'm slightly concerned about the performance when using Contains queries with LINQ. I'm looking at equally comparable queries using both methods, basically having one Where clause. Here are some profiler results:
Where Name = "ber10rrt1": LINQ query: 24 reads; stored procedure: 111 reads
[code]....
Forgetting for a moment the indexes (not my database)... the fact of the matter is that both methods are fundamentally performing the same query (albeit the stored procedure does reference a view for some of the tables). Is this consistent with other people's experience of LINQ to SQL? Also, interestingly enough, using Like "BER10%":
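For reference, the shape of the LIKE pattern usually matters more than LINQ vs. stored procedure here: LINQ to SQL translates StartsWith to a leading-match LIKE that can use an index on the column, while Contains becomes a double-wildcard LIKE that forces a scan. A sketch, where db is the LINQ to SQL DataContext and the entity/property names are placeholders:

Dim startsWithQuery = From c In db.Customers
                      Where c.Name.StartsWith("BER10")   ' translates to LIKE 'BER10%'  (index-friendly)
                      Select c

Dim containsQuery = From c In db.Customers
                    Where c.Name.Contains("ber10")       ' translates to LIKE '%ber10%' (forces a scan)
                    Select c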
I have this code to ping a list of over 400 switches. The program works fine, but for some reason the first time I ping all the switches a lot of the attempts time out (even though they shouldn't), which makes the program a lot slower. If I click ping again, fewer of the attempts time out (and as a result the program runs faster), and if I click it a third time I usually get a completely accurate result with all the switches pinging back "success" (except for a few which I know for a fact are down anyway). Any ideas why this is?
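One common explanation, offered as a guess since the code isn't shown: on the first pass the DNS/ARP resolution for each switch eats most of the ping timeout, so the reply arrives too late and is counted as a failure; on later passes the lookups are cached. Resolving the name separately and using an explicit, generous timeout usually evens this out. A minimal sketch:

Imports System.Net
Imports System.Net.NetworkInformation

Module PingSketch
    ' Returns True if the switch answers; the host name comes from the caller's list.
    Function PingSwitch(ByVal hostName As String) As Boolean
        Try
            ' Resolve the name once, outside the ping timeout, so a slow first DNS
            ' lookup does not get counted as a ping failure.
            Dim address As IPAddress = Dns.GetHostAddresses(hostName)(0)
            Using pinger As New Ping()
                Dim reply As PingReply = pinger.Send(address, 4000) ' explicit, generous timeout
                Return reply.Status = IPStatus.Success
            End Using
        Catch ex As Exception
            Return False
        End Try
    End Function
End Module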
When writing an If statement, I've always used And when needed, like: If 1=1 And 2=2 Then. The only time I ever used AndAlso is when the second condition will error if the first isn't true, like: If Not IsDBNull(Value) AndAlso Value=2 Then. However, recently I've heard that AndAlso is better for performance than And, because the second condition is only evaluated when the first is true. In that case, should I always just use AndAlso?
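A small illustration of the difference: AndAlso stops evaluating as soon as the left side is False, so it is at least as fast as And and safer when the right side can fail, while And is still what you want for bitwise work on integers. The GetValueFromDb helper below is hypothetical:

Module ShortCircuitSketch
    Sub Demo()
        Dim value As Object = GetValueFromDb()

        ' Safe: the right-hand side only runs when the left-hand side is True.
        If Not IsDBNull(value) AndAlso CInt(value) = 2 Then
            Console.WriteLine("value is 2")
        End If

        ' And is still required for bitwise arithmetic on integers:
        Dim flags As Integer = 6
        If (flags And 4) = 4 Then
            Console.WriteLine("bit 2 is set")
        End If
    End Sub

    ' Hypothetical helper standing in for a real database read.
    Function GetValueFromDb() As Object
        Return 2
    End Function
End Module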
I have multiple datasets that I would like to combine into one. There is a common ID field that can be associated with each row. Calling Merge on the dataset will add additional rows to the dataset, but I would like to combine the additional columns instead. There are too many fields to do this in one query, which would make it unmanageable. Each individual query would be able to handle ordering to ensure the data is placed in the correct row.
For example, let's say I have two queries resulting in two datasets:
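The example data isn't shown above, but as a sketch of the column-combining behaviour: if each result table is given the shared ID as its primary key, Merge matches rows on that key and adds the second query's columns onto the existing rows instead of appending new ones. Table and column names are placeholders:

Dim combined As DataSet = dsQuery1.Copy()
combined.Tables("Results").PrimaryKey = New DataColumn() {combined.Tables("Results").Columns("ID")}

Dim second As DataTable = dsQuery2.Tables("Results").Copy()
second.PrimaryKey = New DataColumn() {second.Columns("ID")}

' MissingSchemaAction.Add brings the new columns across; rows are matched on ID.
combined.Tables("Results").Merge(second, False, MissingSchemaAction.Add)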
OK, I've finally got my test database issues sorted out; however, now I need to actually modify my existing report to add the two new fields from my test database. I have made a copy of the report so that I am not modifying the original. I have deleted the two old fields and now want to add my two new fields. In the Field Explorer window, I see an object called "dataset1", and when I expand it, it is using a view that includes fields from the table that I added my two new fields to.
The problem I see here is that I have gone in and modified the view to include these two new fields in my test database, but again, they are not showing up as selectable fields here. I am assuming that "dataset1" is somehow pointing to the live database again and that I need to change the datasource location. In my Solution Explorer, I see an object called "dataset1.xsd", but its properties do not tell me much about where it is actually pointing.
Both of these approaches work, but I'm wondering if there's a difference in performance:
Dim collection As ItemCollection = CType(CellCollection.Where(Function(i) i.IsPending = True), ItemCollection)
For Each item As Item In collection
    'Do something here
Next
and
For Each item As Item In CellCollection.Where(Function(i) i.IsPending = True)
    'Do something here
Next
I thought the second one was better because you have one variable less and it looks cleaner, but on second thought, I'm not quite sure what happens when you put a LINQ query in the iteration.
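For what it's worth, Enumerable.Where is deferred, so nothing is evaluated until the For Each enumerates it and the two forms cost essentially the same; the CType in the first form is the riskier part, since Where returns an IEnumerable(Of Item) rather than an ItemCollection, so the cast will likely fail at runtime. If the filtered result is needed more than once, materializing it once is the cleaner option:

Dim pending As List(Of Item) = CellCollection.Where(Function(i) i.IsPending).ToList()

For Each item As Item In pending
    'Do something here
Next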
I'm sorting a collection of type "Document" (usually around 100k records). Sorting usually takes around 4-5 seconds, and I'm wondering if there's a way to speed it up by modifying my "DocumentComparer" class, which implements IComparer(Of Document). Since the Compare() method is called hundreds of thousands of times, are there any performance improvements here that I've overlooked?
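Without seeing the existing comparer, the usual wins inside Compare() are comparing value-type keys directly and using ordinal string comparison instead of the culture-aware default; the Document properties below (SortDate, Title) are placeholders for the real sort keys:

Imports System.Collections.Generic

Public Class DocumentComparer
    Implements IComparer(Of Document)

    Public Function Compare(ByVal x As Document, ByVal y As Document) As Integer _
        Implements IComparer(Of Document).Compare

        ' Compare the cheap value-type key first and only fall back to the string key on ties.
        Dim result As Integer = x.SortDate.CompareTo(y.SortDate)
        If result <> 0 Then Return result

        ' Ordinal comparison skips the culture tables and is markedly faster.
        Return String.CompareOrdinal(x.Title, y.Title)
    End Function
End Class

If a key has to be computed (parsing, ToString, etc.), precompute it once per Document before sorting rather than inside Compare, since Compare runs on the order of n log n times.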
I have an unusual problem with mailing from my app. At first it wasn't working (I was getting an 'unable to relay' error); anyway, I added the proper authentication and now it works. My problem now is that if I try to send around 300 emails (each with a 500 KB attachment), the app starts hanging around 95% of the way through the process.
Here is some of my code which is called for each mail to be sent
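The code itself isn't included above, but one common cause of a batch like this stalling is not disposing the MailMessage/Attachment objects, which keeps file handles and memory pinned across hundreds of sends. A minimal, generic sketch (addresses and names are placeholders):

Imports System.Net.Mail

Module MailBatchSketch
    Sub SendOne(ByVal client As SmtpClient, ByVal toAddress As String, ByVal attachmentPath As String)
        Using message As New MailMessage("sender@example.com", toAddress)
            message.Subject = "Report"
            message.Body = "See attached."
            Using attachment As New Attachment(attachmentPath)
                message.Attachments.Add(attachment)
                client.Send(message)
            End Using   ' releases the attachment's file handle before the next send
        End Using
    End Sub
End Module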
I have been given the task of making a small game for a school project: picture boxes, moved by a timer, for walking enemies. If there are around 5 or 6 moving picture boxes on the form, my application struggles and lags. After I kill some enemies (remove them from the Controls collection of the Form/Panel) it becomes smooth again. I think the enemy-movement loop is too complicated, but I don't know how to make it simpler.
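A minimal sketch of one common fix, rather than the original movement loop: draw all the enemies yourself on a single double-buffered panel in its Paint event, and have the timer just update positions and call Invalidate(), instead of moving many PictureBox controls:

Imports System.Collections.Generic
Imports System.Drawing
Imports System.Windows.Forms

Public Class GamePanel
    Inherits Panel

    Public Enemies As New List(Of Point)
    Public EnemyImage As Image

    Public Sub New()
        ' Double buffering removes the flicker and repaint cost that many overlapping controls cause.
        Me.DoubleBuffered = True
    End Sub

    Protected Overrides Sub OnPaint(ByVal e As PaintEventArgs)
        MyBase.OnPaint(e)
        For Each position As Point In Enemies
            e.Graphics.DrawImage(EnemyImage, position)
        Next
    End Sub
End Class

The timer handler then only mutates the Enemies list and calls Invalidate() on the panel.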
I'm creating a password hash recovery tool that uses brute force. I have the brute force algorithm run in a BackgroundWorker. It all works, but some similar apps I tried out have roughly 100x the performance of my app (mine does 50k keys/second; some others do 5M keys/second or more). I realize this is partly caused by using a .NET language, but I suspect there are ways to significantly speed up the brute force. A lot of people say I should use multithreading, but how is this done in practice? I think splitting up the passwords to check would slow down my app.
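A minimal sketch of how the keyspace is usually split, assuming the existing code can generate the Nth candidate from an index; CandidateFromIndex and CheckCandidate below are placeholders for the real generator and hash check. Each Parallel.For iteration works a disjoint slice, so threads never duplicate work:

Imports System.Threading
Imports System.Threading.Tasks

Module BruteForceSketch
    Sub Search(ByVal totalCandidates As Long, ByVal sliceSize As Long)
        Dim sliceCount As Long = (totalCandidates + sliceSize - 1) \ sliceSize
        Dim cts As New CancellationTokenSource()

        Try
            Parallel.For(0L, sliceCount,
                         New ParallelOptions With {.CancellationToken = cts.Token},
                         Sub(slice)
                             Dim startIndex As Long = slice * sliceSize
                             Dim endIndex As Long = Math.Min(startIndex + sliceSize, totalCandidates)
                             For i As Long = startIndex To endIndex - 1
                                 If CheckCandidate(CandidateFromIndex(i)) Then
                                     cts.Cancel()   ' found it; stop the remaining slices
                                     Exit For
                                 End If
                             Next
                         End Sub)
        Catch ex As OperationCanceledException
            ' Expected when a match is found and the remaining slices are cancelled.
        End Try
    End Sub

    Function CandidateFromIndex(ByVal index As Long) As String
        Return index.ToString()   ' placeholder keyspace mapping
    End Function

    Function CheckCandidate(ByVal candidate As String) As Boolean
        Return False              ' placeholder hash comparison
    End Function
End Module

The remaining gap to native tools usually comes from the hashing itself (SIMD or GPU implementations), not from the threading.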