  • Anyone having acceptable performance with SQL Server + odbc?
  • 2100 parameters is a documented ODBC limitation (and it applies to all statements in a batch).

    This means that an

    "insert into t (c1, c2) values (?,?), (?,?)..." statement can have at most 2100 bound parameters in total. That has nothing to do with my code, and even less with the surrounding code being "spaghetti" (rough sketch of the arithmetic at the end of this comment).

    The tables ARE normalised; the fact that there are 50 columns is because the underlying market-data calibration functions expect dozens of parameters and return dozens of results, such as volatility, implied durations, forward duration and more.

    The amount of immaturity, inexperience, and ignorance coming from 2 people here is astounding

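    To make the arithmetic concrete, here is a rough sketch (illustrative names, column count matching this case) of what that cap means for a multi-row VALUES insert:

        # Rough sketch: the 2100-bound-parameter cap bounds how many rows fit
        # into one multi-row "INSERT ... VALUES (?,...),(?,...)" statement,
        # regardless of how the surrounding code is written.
        MAX_PARAMS = 2100
        N_COLUMNS = 50                            # the table is ~50 columns wide
        ROWS_PER_STMT = MAX_PARAMS // N_COLUMNS   # = 42 rows per statement

        def insert_sql(table, columns, n_rows):
            # Build "INSERT INTO t (c1, ...) VALUES (?, ...), (?, ...)" for n_rows rows.
            row = "(" + ", ".join("?" * len(columns)) + ")"
            return f"INSERT INTO {table} ({', '.join(columns)}) VALUES " + ", ".join([row] * n_rows)

        def chunks(rows, size=ROWS_PER_STMT):
            # 15000 rows at 42 rows per statement -> ~358 round trips per batch.
            for i in range(0, len(rows), size):
                yield rows[i:i + size]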

  • Anyone having acceptable performance with SQL Server + odbc?
  • I timed the transaction and the opening of the connection: it takes maybe 100 milliseconds, which absolutely doesn't explain the abysmal performance.

    A transaction is needed because two tables are touched; I don't want to deal with partially inserted data.

    Cannot share the code, but it's Python calling .NET through "clr", and using SqlBulkCopy.

    What do you suggest, if I shouldn't be using that? It's either a prepared query with thousands of parameters, or a plain text string with the parameters inlined (which admittedly I didn't try, might be faster lol).
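    For the curious, this is roughly the shape of the clr + SqlBulkCopy path; an untested sketch, with table name, columns and connection string made up:

        # Rough, untested sketch of the Python -> .NET SqlBulkCopy path via pythonnet.
        import clr
        clr.AddReference("System.Data")

        from System.Data import DataTable
        from System.Data.SqlClient import SqlConnection, SqlBulkCopy

        def bulk_copy(rows, columns, conn_str, dest_table):
            table = DataTable()
            for name in columns:
                table.Columns.Add(name)            # string-typed here; set real types in practice
            for row in rows:
                table.Rows.Add(*[str(v) for v in row])

            conn = SqlConnection(conn_str)
            conn.Open()
            try:
                # The (connection, options, transaction) overload accepts an external
                # SqlTransaction if both tables must commit together.
                bulk = SqlBulkCopy(conn)
                bulk.DestinationTableName = dest_table
                bulk.BatchSize = 5000              # stream in batches rather than one huge write
                bulk.WriteToServer(table)
            finally:
                conn.Close()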

  • Anyone having acceptable performance with SQL Server + odbc?
  • I will try bcp. Somehow, I was convinced I had to have access to the machine running the SQL Server to use it, but from the docs I see I can specify a remote host (example below).. Will report back! EDIT: I can't install bcp because it is only distributed with SQL Server itself, and I cannot install it on my corporate laptop.
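    For reference, this is what I had in mind before hitting that wall; a hypothetical example with made-up server, table and credentials:

        # Hypothetical: running bcp from Python against a remote instance.
        # bcp only needs network access to the server, not to run on the DB box itself.
        import subprocess

        subprocess.run([
            "bcp", "MarketData.dbo.CalibrationResults", "in", "rows.csv",
            "-S", "sqlhost.example.com,1433",   # remote server (host,port)
            "-U", "loader", "-P", "********",   # SQL login; -T would use integrated auth instead
            "-c",                               # character-mode data file
            "-t", ",",                          # field terminator
            "-b", "5000",                       # commit every 5000 rows
        ], check=True)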

  • Anyone having acceptable performance with SQL Server + odbc?

    Omg it's sooo daammmn slooow, it takes around 30 seconds to bulk-insert 15000 rows.

    Disabling indices doesn't help. The database recovery model is SIMPLE. My table is 50 columns wide, and from what I understand the main reason is the stupid limit of 2100 parameters per query in the ODBC driver. I am using the .NET SqlBulkCopy. I only open the connection + transaction once per ~15000 inserts (quick throughput math below).

    I have 50 million rows to insert, it takes literally days, please send help, I can fucking write with a pen and paper faster than the damned Microsoft driver inserts rows.
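    The back-of-the-envelope numbers, using the timings above:

        # 15000 rows in ~30 s -> 2 ms per row -> ~28 hours of pure insert time
        # for 50 million rows, before anything else gets in the way.
        rows_per_batch = 15_000
        seconds_per_batch = 30
        total_rows = 50_000_000

        per_row = seconds_per_batch / rows_per_batch    # 0.002 s per row
        total_seconds = total_rows * per_row            # 100,000 s
        print(total_seconds / 3600)                     # ~27.8 hours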

  • parquet vs csv
  • I would not recommend using parquet instead of csv. Indeed, parquet is a type of wooden flooring, while csv is a human-readable file format. As you can see, it is not wise to replace one with the other. Don't hesitate to ask more questions about your home design!

  • Using Ocaml to learn FP. What FP concepts should I touch on?
  • I would be lying if I said that I was a professional OCaml developer.

    I definitely do think the Jane Street libraries are very hard to read... but that might be my inexperience.

    I personally think regular, procedural code is much easier to read ¯\_(ツ)_/¯

  • Advice on where to begin with GUI programming?
  • Sorry to that guy, but Python is a terrible choice for GUI work. It simply doesn't even exist in the wild, except in newbies' minds. You should pick something that allows you to *easily and immediately* reload your GUI in progress without restarting the program; AND also think about how to distribute your GUI to other computers without requiring them to duplicate your Python setup.

  • Stop using pandas in web development

    Are you doing data science? Statistics? No?

    Then for god's sake don't use pandas; you just look dumb af when you pull in several MB of a package just to load a csv (the stdlib does that fine, sketch below). If you find yourself doing that, just stop programming and look for another job.

    Thanks for your attention
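    For reference, the no-pandas version is all of a few lines; a sketch with a made-up file and column name:

        # Loading a csv with only the standard library.
        import csv

        with open("users.csv", newline="") as f:
            for row in csv.DictReader(f):
                print(row["email"])         # each row is a plain dict keyed by the header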
