Garbage Collector and VO->X#-Conversion

Public support forum for peer-to-peer support related to the Visual Objects and Vulcan.NET products
Phil Hepburn
Posts: 743
Joined: Sun Sep 11, 2016 2:16 pm

Garbage Collector and VO->X#-Conversion

Post by Phil Hepburn »

Hi Nick,

Since you are another guy who has had an enormous influence on keeping me up to date with ICT and Microsoft's coding technology, I thought you may like to see the attached images.

As you can see, you are a Customer in my test data - which is being tested with 'LINQ to SQL' and the 'Code First' approach. It's going well so far - hardly any errors now as I develop.

I have gotten to the point where I can write useful and meaningful LINQ query code - the attached example gets the products for a single specified customer. Since you are the customer with ID 5, these are your products - the ones on your account. Sorry they are not much more interesting than boxes and packaging ;-0) But I thought you and the world would like to see what you purchase and use ;-0))

I am now about to add grouping to the query code, so that when you have more orders each product item only appears once in the list, together with its total purchase quantity. Roughly, the query looks something like the sketch below.
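
Here is a C# sketch of the shape of the thing - note that 'db', Orders, OrderItems, Quantity and the other names are illustrative stand-ins, not my actual test classes:

Code:

// LINQ to SQL sketch: products bought by customer 5, grouped so that each
// product appears only once, together with the total quantity purchased.
// 'db' is an open DataContext; all entity and property names are assumed.
var productTotals =
    from o in db.Orders
    where o.CustomerID == 5
    from line in o.OrderItems
    group line by line.Product.Name into g
    select new
    {
        Product = g.Key,
        TotalBought = g.Sum(l => l.Quantity)
    };

foreach (var p in productTotals)
    Console.WriteLine("{0} x {1}", p.Product, p.TotalBought);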

Back to your big apps .... it seems to me that even though you and others may keep that code at arm's length, we still need to apply newer, modern coding techniques to help remove coding errors from anything we write. So unit testing is, and will be, very important for some.

For the local UK guys selling their business (my last post), their business is VERY technical, analytical and mathematical, so TESTING was a big must for them. They need to show prospective buyers that the code is robust, solid - and correct.
[Attachments: NFproducts_01.jpg, NFproducts_02.jpg]
Speak soon,
Phil.
P.S. got a stinky head cold, can't think too clearly !
NickFriend
Posts: 248
Joined: Fri Oct 14, 2016 7:09 am

Garbage Collector and VO->X#-Conversion

Post by NickFriend »

Hi Phil,

Nice to see you working hard!.... and don't talk about colds, I've had a series of them over the last few weeks. Also, if those are boxes I've bought from you, they're bloody expensive!

More seriously, I understand why you're working through the LINQ to SQL stuff, but looking at your example, I think it highlights where the strengths of Entity Framework lie. As you know I hate SQL, and one of the reasons is that it quickly becomes unreadable, and this is reflected in how LINQ to SQL works as well. Looking at your example (which is well beyond my own LINQ abilities), it's not at all clear WHAT the code is doing. The same would be true for the raw SQL statement. You'd have to sit down and methodically work through the statement to be able to tell what was going on. That makes code unmaintainable.

The same thing using LINQ to Entities would almost certainly be understandable more or less immediately. I understand that LINQ to Entities has its limitations, but you can always revert to raw SQL through DBContext.SQLCommand for exceptional circumstances.
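
(If memory serves, in EF6 the raw-SQL escape hatches are Database.SqlQuery<T> for reads and Database.ExecuteSqlCommand for commands - a minimal sketch, where MyDbContext, Product and the column names are assumed for illustration:)

Code:

// Minimal EF6 sketch - MyDbContext and Product are assumed/illustrative.
using (var ctx = new MyDbContext())
{
    // Read via raw SQL, materialised straight into entities:
    var expensive = ctx.Database
        .SqlQuery<Product>("SELECT * FROM Products WHERE Price > @p0", 100)
        .ToList();

    // Non-query command for the odd case L2E can't express well:
    int rows = ctx.Database
        .ExecuteSqlCommand("UPDATE Products SET Discontinued = 1 WHERE Price > @p0", 500);
}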

So my vote is that you ditch the LINQ to SQL and move straight on to LINQ to Entities, because I think it's much more useful to XBase programmers, and of course with Vulcan it wasn't an option.

Great work, and nice to see you turning your talents to X#.

Nick
Phil Hepburn
Posts: 743
Joined: Sun Sep 11, 2016 2:16 pm

Garbage Collector and VO->X#-Conversion

Post by Phil Hepburn »

Hi Nick,

Yes, I understand your concerns ;-0) But in fact the 'query syntax' is not as awful as it first looks, once you know how to read it and get your eye in.

One reason for doing some L2S is that we can use existing data tables in our own, or a commonly used, database quite easily - that is, NOT using 'Code First'. The mapping is relatively easy.

And if we use existing DBs then we do need to know how to JOIN tables, and how to GROUP rows. And a bit more as well.

In 'L2S' the mapping is much simpler than in 'L2E'. Remember, this technology is not just LINQ query statements to get data back - it's the other important bits as well.
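
To give a flavour, mapping a class onto an existing table in L2S is just a couple of attributes (the table, column and class names here are made up for illustration):

Code:

using System.Data.Linq;          // LINQ to SQL
using System.Data.Linq.Mapping;

// Map a class straight onto an existing table - no 'Code First' database creation.
[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)]
    public int CustomerID;

    [Column]
    public string Name;
}

// A strongly-typed DataContext ties the mapped classes to a connection string.
public class ShopDataContext : DataContext
{
    public ShopDataContext(string connection) : base(connection) { }

    public Table<Customer> Customers { get { return GetTable<Customer>(); } }
}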

When we move to L2E we are really best off starting a new database, which means we need to be able to do our own mapping - so it's handy if we practise a bit with L2S first.

Yes, I like the fact that we no longer have to JOIN tables all the time, as L2E handles that sort of thing by letting us map it into the entities - things like a Customer class with a property that is a collection of Invoices, and another that is a collection of Products used. Each Invoice can in turn have a deeper property which is a collection of invoice 'Item Lines'.
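
As a sketch (class and property names invented for the example, not my actual model), the Code First shape is along these lines:

Code:

using System.Collections.Generic;

// Illustrative Code First entities - navigation properties replace hand-written JOINs.
public class Customer
{
    public int CustomerID { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Invoice> Invoices { get; set; }
}

public class Invoice
{
    public int InvoiceID { get; set; }
    public int CustomerID { get; set; }
    public virtual ICollection<InvoiceLine> ItemLines { get; set; }
}

public class InvoiceLine
{
    public int InvoiceLineID { get; set; }
    public int Quantity { get; set; }
    public virtual Product Product { get; set; }
}

public class Product
{
    public int ProductID { get; set; }
    public string Name { get; set; }
}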

L2S is a simpler way in to using collections for everything that is even vaguely data-like.

L2S or L2E, it's swings and roundabouts - BUT - sorry, I can't make things magically work for guys like you who are confessed 'SQL haters'. If we know some SQL then stuff makes a lot more sense, and life gets easier.

NO, don't do it - I would STRONGLY advise anyone against sending SQL code straight through to the SQL engine from the app as text strings - you miss the whole point of L2S and L2E, where the .NET code is checked by the compiler and the data types used are all .NET - no T-SQL required. No, NO, NO !!!

I came from ADO.NET and was very happy when I could scrap the 'straight through' query approach.

Okay Nick - L2E it is then, when I come back next week from a family trip to the North East of England.

Oh! - I was also just checking (in my research) whether X# was as good as I had heard, and that LINQ and friends just work well - as indeed they do, well done Roslyn. I was also getting to know the details of the LINQ syntax - which I am sharing with all X# colleagues.

Oh! and if I do a couple of sessions at the conference then I need to cover it all - I can't miss bits out in the middle in a rush to modernity.

Hope this makes sense to you.
Cheers,
Phil.

P.S. whatever your smart postings, you are really just a DBF man at heart - rows and columns until the cows come home ;-0)
NickFriend
Posts: 248
Joined: Fri Oct 14, 2016 7:09 am

Garbage Collector and VO->X#-Conversion

Post by NickFriend »

Hi Phil,

I do agree about sending raw SQL. However where Entity Framework is weak is on some specific things like bulk updates. So for those very specific cases it becomes necessary to use raw SQL just to keep performance acceptable.

But for 99% of cases, yes, compiler-checked code through L2E.

I actually enjoy using a db like SQL Server now, and hate it when I have to go back to maintaining old DBF based VO apps. Virtually all our data handling now is done by getting data out of the db (with L2E) and into objects in memory, carrying out all the processing in memory, then writing any updated info back to the db when finished. It's so much more flexible than being tied to the DB structure as you tend to be when working directly over DBF files.

Nick
Phil Hepburn
Posts: 743
Joined: Sun Sep 11, 2016 2:16 pm

Garbage Collector and VO->X#-Conversion

Post by Phil Hepburn »

Okay Nick,

We seem to be in-line with our thinking on these matters ;-0)

I also buy the idea of doing 'bulk updates' and similar stuff via RAW SQL statements.

Yes, I have gotten VERY comfortable with working with objects, 'Business Objects' all of the time, and also collections of them too.

After working on L2E when I return from the North East, I will start to turn my mind to looking seriously at Generics in X#. I seem to remember you doing a session or two in Gloucester last year.

At the moment I am just delighted to see what can be done in X# with LINQ and L2S and hopefully with L2E. It was almost worth the six year wait ;-0((

Speak soon,
Phil.

P.S. yes, my 'box and package' prices are a bit steep - maybe that's why they are not selling !?
Phil Hepburn
Posts: 743
Joined: Sun Sep 11, 2016 2:16 pm

Garbage Collector and VO->X#-Conversion

Post by Phil Hepburn »

Hi Nick,

Just a thought .... so other guys get the picture ....
In the same way as 'RAW' SQL can be sent from our apps to the DB engine from both L2S and L2E, we need to remember that we are just using mappings from our object classes to a set of disconnected tables in the SQL database.

So to my mind we could use both technologies from our apps - yes, both L2S and L2E. Maybe a set of bulk updates can be done more efficiently / quickly via an L2S mapping, while at the same time having L2E for other parts of the app.

When you talk of RAW SQL being sent directly to the DB, what exactly are we talking about? It can't be the difficult 'Entity' stuff, as that would make the required SQL statement DIFFICULT in the extreme.

Have you ever thought of carefully mixing the two technologies? It is OK in your case since the SQL Server is a Microsoft one, and I believe the limitation is still in place that L2S only works with MS SQL Server, unlike L2E which handles all SQL flavours.

ALSO, guys need to know that even with L2S and L2E we can have multiple mappings - some simpler than others - and just use them one at a time, switching back and forth. This is all possible because of the disconnected data model of the SQL database engine - it works one SQL statement / query at a time.

Because we are then doing everything in .NET objects (and collection objects), we can supply the same collections to one or the other - L2S, L2E, and even different versions of these as well. Okay, we may have to be a bit careful at times doing this, BUT there is no reason why we can't.

This morning I will try to mix two or three different L2S mappings in the one app. Basically we define two or more 'DataContext' subclasses - each of which can be used separately, but more or less at the same time - see the sketch below.
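
In outline (with illustrative names again - Customer, Order and Product are assumed mapped classes), the idea is simply:

Code:

// Two independent L2S contexts, each mapping only the tables it needs.
// Because the model is disconnected, they can happily be used side by side,
// one query at a time.
public class SalesContext : DataContext
{
    public SalesContext(string cn) : base(cn) { }
    public Table<Customer> Customers { get { return GetTable<Customer>(); } }
    public Table<Order> Orders { get { return GetTable<Order>(); } }
}

public class StockContext : DataContext
{
    public StockContext(string cn) : base(cn) { }
    public Table<Product> Products { get { return GetTable<Product>(); } }
}

// ... and then, more or less at the same time:
using (var sales = new SalesContext(salesConnection))
using (var stock = new StockContext(stockConnection))
{
    int customerCount = sales.Customers.Count();
    int productCount  = stock.Products.Count();
}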

Any thoughts and ideas ? And please explain your BULK process requirements.

All the Best,
Phil.

P.S. I remember using Bulk Copy and DataTables with ADO.NET to do some pretty amazing transfers - VERY quick but now seems 'old hat'.
NickFriend
Posts: 248
Joined: Fri Oct 14, 2016 7:09 am

Garbage Collector and VO->X#-Conversion

Post by NickFriend »

Hi Phil,

The thing about EF is that basically it works on a single entity at a time, identifying each one by its primary key.

So the normal way to update a single entity would be (in pseudo-code)

Code:

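// (pseudo-code, as noted above: Add() is really an insert - for an update to an
//  existing row, EF6 would Attach() the object and mark it EntityState.Modified)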
dbcontext.Set<Product>().Add(myupdatedproductobject);
dbcontext.SaveChanges();
So EF identifies the primary key in the object, and generates SQL something like "UPDATE Product ..... WHERE ProductID = myobjectid".

You can also send in a list of objects...

Code:

dbcontext.Set<Product>().AddRange(myupdatedproductobjectslist);
dbcontext.SaveChanges();
That starts to look like a bulk update, but I think all that is happening is that it's creating the same SQL as before, once for each item in the list... so if you have 100 objects in the list, you'll get 100 update statements.

So if what you were actually doing was, for example, adding £5 to the price of a range of products, then you'd be much better off doing something like

Code:

dbcontext.Database.ExecuteSqlCommand("UPDATE Products SET Price=Price+5 WHERE ProductType=XXX;");
I also had a situation where we occasionally need to copy a large number of records from one table to another... I tried it first with straight EF, and it might take say 1 minute. With raw SQL it was 2 seconds.

And finally there are some things that EF just can't really do...

Code:

dbcontext.Database.ExecuteSqlCommand("DELETE FROM [Product]; DBCC CHECKIDENT ('Product', RESEED, 0);");
Nick
lumberjack
Posts: 727
Joined: Fri Sep 25, 2015 3:11 pm
Location: South Africa

Garbage Collector and VO->X#-Conversion

Post by lumberjack »

Phil,

Although I think there is merit in the logic of LINQ, and I see there is work being done by the PostgreSQL development group on the Npgsql .NET assembly for PG to incorporate LINQ, I have some concerns due to different database flavours.

Let's look at inserting two rows into a table. We will construct a RAW SQL statement to do the job:
"INSERT INTO TABLE1 (FIELD1, FIELD2) VALUES ('A', 'AA')"
"INSERT INTO TABLE1 (FIELD1, FIELD2) VALUES ('B', 'BB')"

Now if we assume the table contains a FIELD0 that is a sequence (auto-number), we need to retrieve it via another SELECT statement to get the last inserted max(FIELD0), and maybe also do a SELECT to retrieve some auto-populated FIELD3-XXX from the table and display it in our view.

What concerns me is: how will LINQ know that my specific RDBMS can do all this in one command, giving me a 20x better response over a network?

For example in PG:
"INSERT INTO TABLE1 (FIELD1, FIELD2) VALUES ('A', 'AA'), ('B', 'BB')"

Now it even gets more interesting:
"INSERT INTO TABLE1 (FIELD1, FIELD2) VALUES ('A', 'AA'), ('B', 'BB') RETURNING *"

I can use a DataReader on the above, and in one statement I can insert the rows and have the newly inserted rows retrieved with a single network hit - which we know is the slowest part of any software system. A rough sketch follows.
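
In Npgsql terms the round trip looks roughly like this (connection string and column names are illustrative only):

Code:

using System;
using Npgsql;

// One network hit: insert both rows and read the fully-populated rows
// (sequence FIELD0, any defaults) straight back through a DataReader.
using (var conn = new NpgsqlConnection("Host=localhost;Database=test;Username=me;Password=pw"))
{
    conn.Open();
    var sql = "INSERT INTO TABLE1 (FIELD1, FIELD2) VALUES ('A', 'AA'), ('B', 'BB') RETURNING *";
    using (var cmd = new NpgsqlCommand(sql, conn))
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine("{0} {1} {2}", reader["FIELD0"], reader["FIELD1"], reader["FIELD2"]);
    }
}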

So in general, yes, I do think there is a place for LINQ, even if it is just to highlight the issues with SQL syntax, but I just don't think LINQ is the alpha and omega solution for all data interchange.

Just my 2c.

LJ
______________________
Johan Nel
Boshof, South Africa
NickFriend
Posts: 248
Joined: Fri Oct 14, 2016 7:09 am

Garbage Collector and VO->X#-Conversion

Post by NickFriend »

Hi Jack,

I agree 100% about the efficiency issue with L2E when working with a remote database (network or Internet server) - I think it would be pretty disastrous in those circumstances because of the amount of traffic it can generate.

However, this comes back to Arne's earlier comment about working with remote services and thin clients. The way we use it is from our server modules direct to the SQL database, both of which are invariably on the same machine. The client machines simply work with in-memory objects - when the physical database needs updating, the object is sent back to the server, and the server writes the data into the db residing on the same physical machine, using L2E.

And, as Phil mentioned, one of the great things about L2E (apart from not having to learn much SQL syntax!) is of course that it's all compile-time checked.

But yes, if you were making direct calls from a client machine to a remote db, L2E could be disastrous for performance.

And re. different flavours of database - the idea that EF can really abstract things out to the level where you can work with different dbs transparently quite frankly just doesn't work, precisely for the reasons you say: different dbs have different options for optimising processes. For example, we use SQL Server RowVersion columns to simplify concurrency control, and this info needs to be round-tripped to the clients, so our clients are really bound to SQL Server even though we work exclusively through EF and L2E.
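
(For anyone who hasn't used it, the RowVersion approach is just an annotated property on the entity - a rough sketch with made-up class and property names:)

Code:

using System.ComponentModel.DataAnnotations;

public class Order
{
    [Key]
    public int OrderID { get; set; }

    public decimal Total { get; set; }

    // Maps to a SQL Server rowversion column and acts as EF's concurrency token:
    // if the value has changed since the row was read, SaveChanges() throws a
    // DbUpdateConcurrencyException instead of silently overwriting the row.
    [Timestamp]
    public byte[] RowVersion { get; set; }
}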

Nick
Phil Hepburn
Posts: 743
Joined: Sun Sep 11, 2016 2:16 pm

Garbage Collector and VO->X#-Conversion

Post by Phil Hepburn »

Hi Jack,

LINQ to Entities will work with a range of different SQL 'flavours', unlike LINQ to SQL (I believe). PG should be supported if you look, in the way that other Microsoft data technologies link to different back-end databases.

So the LINQ code syntax in .NET (C# and X#) will create correct PG SQL statements - it is quite amazing what it does.

The point about this whole 'ORM' approach is to allow the developer to move away from data as cells, rows and columns (and even tables) and to see data as .NET objects - business objects. So his/her code describes the problem being solved rather than telling the computer 'bits' what to do.

There is no right or wrong way - BUT - having done them all, including a LOT of direct SQL querying and reporting, I think LINQ is definitely the way to go. I have gotten to LOVE it ! Took a while though ;-0)

Also, don't forget that LINQ is the modern .NET way of handling data collections in memory - 'LINQ to Objects' has nothing to do with a SQL back-end server, as such.
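
A trivial example, just to make the point that no database is involved at all:

Code:

using System;
using System.Linq;

// LINQ to Objects: plain in-memory collections, no SQL anywhere.
int[] numbers = { 5, 1, 8, 3, 9, 2 };
var topThree = numbers.OrderByDescending(n => n).Take(3);
Console.WriteLine(string.Join(", ", topThree));   // 9, 8, 5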

If you wish to see the SQL statement that LINQ creates behind the scenes, it is easy to display it, and to copy and paste it into Management Studio, or your own proprietary IDE, to see it in action.
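
For example (a sketch - 'db' being an L2S DataContext, 'ctx' an EF6 DbContext and 'query' some IQueryable; these are the usual mechanisms as far as I know):

Code:

// LINQ to SQL: log everything the DataContext sends, or grab one query's text.
db.Log = Console.Out;
string l2sSql = db.GetCommand(query).CommandText;

// LINQ to Entities (EF6): ToString() on the IQueryable returns the store SQL,
// and Database.Log gives a running trace.
string l2eSql = ctx.Products.Where(p => p.Price > 100).ToString();
ctx.Database.Log = Console.WriteLine;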

My personal view (after MANY hours, days, months, and years work) is that LINQ is the way to go generally.

But there again, we still have many friends and colleagues who are not even convinced about SQL, and love their DBFs. There is nothing wrong with rows and columns, but then we still like steam trains ;-0) Puff, puff !!!

I will continue with my research to make LINQ technologies available to all X# guys, and provide working syntax for all of the parts and areas of LINQ, Objects, SQL, and Entities too.

Even when I used ADO.NET 7 or 8 years ago, we had moved away from 'DataReader' technology and were getting 'DataTable'-compatible result sets - it was so easy to find and display bunches of data that way.

And we won't go into discussing WPF and WinForms ;-0((

Kind regards,
Phil.