Read data as array #770
Comments
No, currently this is not supported - see also #658
This is also something we'd be interested in. When changing some other parts of our pipeline from objects to arrays, we saw about a 65% reduction in time spent loading, processing, and storing data. The primary reasons are that objects generate a lot more garbage, are much larger when converted to JSON, and arrays are much more efficient to iterate over. Not all of these apply directly to this module, but I would expect a pretty measurable performance boost, certainly in our app-side code handling the results, and hopefully also in the node-mysql-side code if it has to do a lot less work marshaling data around.
It would indeed improve performance here, especially because it would remove the need for the module to look up the field name for each column in each row and add a new property to the object for each column (and in fact, since we know how many fields come back, we could pre-allocate each row's array with the necessary number of elements, removing any potential array re-allocation).
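The change described above would live inside the module's row parser, but the same idea can be roughly illustrated app-side. The sketch below (the connection settings and the contractors table are hypothetical) converts each row object into a pre-allocated array using the `fields` metadata that node-mysql already passes to the query callback:

```js
// Hypothetical app-side illustration: convert each row object into a
// pre-allocated array using the `fields` metadata node-mysql passes
// to the query callback. Table and column names are made up.
const mysql = require('mysql');

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  database: 'test',
});

connection.query('SELECT * FROM contractors', (err, rows, fields) => {
  if (err) throw err;

  const rowsAsArrays = rows.map((row) => {
    // The column count is known up front, so the array can be
    // pre-allocated and never needs to grow.
    const arr = new Array(fields.length);
    for (let i = 0; i < fields.length; i++) {
      arr[i] = row[fields[i].name];
    }
    return arr;
  });

  console.log(rowsAsArrays[0]); // e.g. [199, 'Andrew', 'Banks', ...]
  connection.end();
});
```

This still pays the cost of building the row objects first, so it only approximates the benefit of the module producing arrays directly.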
This would be a great feature. node-pg has this; it's a query config option called rowMode: 'array'.
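For reference, node-pg's option is set per query. A minimal sketch, assuming a hypothetical contractors table and default connection settings:

```js
// node-pg: pass rowMode: 'array' in the query config to get each row
// back as an array of column values instead of an object.
// Connection settings, table and column names are placeholders.
const { Client } = require('pg');

async function main() {
  const client = new Client();
  await client.connect();

  const res = await client.query({
    text: 'SELECT contractor_id, contractor_name FROM contractors',
    rowMode: 'array',
  });

  console.log(res.rows[0]); // e.g. [199, 'Andrew']
  await client.end();
}

main().catch(console.error);
```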
PRs to implement are welcome and will speed up the time-to-release significantly :)!
Subscribing because I'm interested in this as well.
Agreed that this is highly desirable, and the efficiency gain can only grow with the amount of data.
Hi @tomcon, you're welcome to work on a pull request to implement this feature! We would love it!
+1 EDIT: I have now converted some result sets from objects to arrays, and one result set shrank from 7151 bytes to 3459 bytes, a reduction of more than 50% in serialized size.
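For anyone wanting to reproduce that kind of comparison, here is a quick sketch (with made-up sample data) that measures the serialized size of object rows versus array rows:

```js
// Rough size comparison of object rows vs. array rows when serialized
// to JSON. The sample rows here are made up; in practice you would use
// the rows returned by your query.
function jsonSize(value) {
  return Buffer.byteLength(JSON.stringify(value), 'utf8');
}

const objectRows = [
  { contractor_id: 199, contractor_name: 'Andrew', rate: 88.92 },
];
const arrayRows = objectRows.map((row) => Object.values(row));

console.log('objects:', jsonSize(objectRows), 'bytes');
console.log('arrays :', jsonSize(arrayRows), 'bytes');
```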
Hi @Stavanger75, you're welcome to work on a pull request to implement this feature! We would love it!
Hi @dougwilson, thanks for the invite. It looks like @ifsnow has already completed the task. That's great :-)
Indeed, @ifsnow submitted it a few days after my invite :)
Is there any update on this option?
For anyone reading this and waiting, you can use the mostly compatible mysql2 library in the meantime: https://github.com/sidorares/node-mysql2/blob/master/documentation/Extras.md#receiving-rows-as-array-of-columns-instead-of-hash-with-column-name-as-key
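Based on the linked mysql2 documentation, enabling this looks roughly like the following per-query sketch (connection settings and table name are placeholders):

```js
// mysql2: set rowsAsArray on the query to receive each row as an array
// of column values. Connection settings and table name are placeholders.
const mysql = require('mysql2');

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'root',
  database: 'test',
});

connection.query(
  { sql: 'SELECT contractor_id, contractor_name FROM contractors', rowsAsArray: true },
  (err, rows, fields) => {
    if (err) throw err;
    console.log(rows[0]); // e.g. [199, 'Andrew']
    connection.end();
  }
);
```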
Is there an option for getting records from MySQL as a flat array rather than an object? Something like this:
{ contractor_id: 199, contractor_name: ... }
represented as [199, "Andrew", "Banks", 88.92, 55, 15].
I want to speed up the reading of 75k records, which currently takes 35 seconds, and I am wondering whether an array representation would be transferred to Node faster than that.