
Conversation

@jkingyens

Use a Node.js array to queue up pending callbacks when requests are made before the token is fetched. The current mechanism uses an event emitter to add listeners, which can cause warnings when there is a buildup of requests during the initialization phase:

(node) warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit.
Trace
at Connection.addListener (events.js:160:15)
at Connection.once (events.js:185:8)
at Connection.createAuthorizedReq (/Users/jkingyens/reach_system/node_modules/gcloud/lib/common/connection.js:272:8)
at Connection.req (/Users/jkingyens/reach_system/node_modules/gcloud/lib/common/connection.js:230:8)
at Bucket.makeReq_ (/Users/jkingyens/reach_system/node_modules/gcloud/lib/storage/index.js:572:19)
at Bucket.stat (/Users/jkingyens/reach_system/node_modules/gcloud/lib/storage/index.js:269:8)
at Bucket.createReadStream (/Users/jkingyens/reach_system/node_modules/gcloud/lib/storage/index.js:404:8)
at /Users/jkingyens/reach_system/Gruntfile.js:48:31
at /Users/jkingyens/reach_system/node_modules/async/lib/async.js:125:13
at Array.forEach (native)
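The array-based buffering being proposed can be sketched roughly like this (names such as `TokenClient` and `withToken` are illustrative, not gcloud's actual internals): callbacks that arrive before the token exists are pushed onto an array and flushed once, so no listener limit is involved.

```javascript
// Hypothetical sketch of queueing pending callbacks in a plain array
// until an async token fetch completes (not gcloud's actual code).
function TokenClient(fetchToken) {
  this.token = null;
  this.pending = []; // callbacks waiting for the token
  var self = this;
  fetchToken(function (token) {
    self.token = token;
    // Flush every queued callback exactly once.
    var queued = self.pending;
    self.pending = [];
    queued.forEach(function (cb) { cb(token); });
  });
}

TokenClient.prototype.withToken = function (cb) {
  if (this.token) {
    cb(this.token);        // token already available: run immediately
  } else {
    this.pending.push(cb); // otherwise buffer until the fetch finishes
  }
};
```

Any number of callers can queue up during initialization without triggering Node's listener-count heuristic, since no EventEmitter is involved.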

@stephenplusplus
Contributor

Use emitter.setMaxListeners() to increase limit.

Maybe we can just do this? http://nodejs.org/api/events.html#events_emitter_setmaxlisteners_n

@ryanseys
Contributor

What limit would we set? Would this limit be sufficient for all cases and why?


@ryanseys
Contributor

The warning is: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit. So the question is: do we actually have a memory leak? I'm assuming not (sorry, I'm not too familiar with this code). But if we just increase the limit, we reduce our ability to find legitimate memory leaks in the future.

@stephenplusplus
Contributor

No memory leak, just a low value set by default. I would set the limit to 0 for unlimited.

@stephenplusplus
Contributor

@jkingyens can you show the code you used that triggered this error? I'm curious how so many connections were queuing up.

Regardless, I'm preparing a PR with setMaxListeners(0) and would appreciate if you could test it :)

stephenplusplus added a commit to stephenplusplus/gcloud-node that referenced this pull request Sep 18, 2014
Node's default limit for max listeners on an event emitter is a low 10.
We are using an event emitter internally in the Connection class to
handle queueing callbacks for the token retrieval event. The default of
10 is now lifted to unlimited, so the developer doesn't have to worry
about overriding the default themselves.

Fixes googleapis#223
@jkingyens
Author

I am queuing up requests for GCE account activity, available as daily JSON records in Google Cloud Storage. Since it's Sept 18th, I would have sent out 18 requests in parallel for the data. I don't want any unnecessary serialization, so as to keep query latency very low.

In general, I think using event emitters to buffer up async callbacks makes for less readable code. The buffering technique I used here is the way I normally see things done, and the advantage is that it's very readable. If you look at Node.js modules like Mongoose, this is the way they do things: if you make a MongoDB request before a connection becomes active, it queues using this method.

https://github.com/LearnBoost/mongoose/blob/master/lib/connection.js#L447-L450

@jkingyens
Author

The tests are failing as well so I would have to fix that too.

@stephenplusplus
Contributor

We started out with a similar technique, but queueing behavior is built into Node with event emitters. No need to reinvent the wheel. this.on('connected', callback) is, to me, a perfect use of queueing, and equally readable.

Happy to hear other opinions, however.

@jkingyens
Author

IMO, it's not built in. If it were, it wouldn't need a workaround like setMaxListeners(0) to make it work correctly. What you are essentially doing is turning off the Node.js warning mechanism that signals something isn't quite right. Does that sound good to you?

@stephenplusplus
Contributor

By default EventEmitters will print a warning if more than 10 listeners are added for a particular event. This is a useful default which helps finding memory leaks. Obviously not all Emitters should be limited to 10. This function allows that to be increased. Set to zero for unlimited.

Google around. It's a known, ridiculously low limit.

@jkingyens
Author

Fair enough. I just think a large accumulation of listeners on a single event emitter object is a bad smell, which is why these limits are in place and haven't been removed despite the occasional annoyance. It's probably more personal preference than anything else, and I just want to get off my fork and have the problem go away, so whatever works is fine with me :)

@rakyll rakyll closed this in #225 Sep 20, 2014
@jkingyens
Author

thanks!

@rakyll
Contributor

rakyll commented Sep 20, 2014

We need to address the setMaxListeners(0) concern, but published this hotfix as 0.7.1.

sofisl pushed a commit that referenced this pull request Nov 16, 2022
sofisl pushed a commit that referenced this pull request Nov 17, 2022
sofisl pushed a commit that referenced this pull request Nov 18, 2022
* chore: update jsdoc - protos and double quote

* chore: update jsdoc - protos and double quote

Co-authored-by: Alexander Fenster <github@fenster.name>
miguelvelezsa pushed a commit that referenced this pull request Jul 23, 2025
sofisl pushed a commit that referenced this pull request Jan 27, 2026
… node can be used with profiler. (#223)

* fix: make version restrictions a warning instead of an error

* coerce version