cache: serialize data emit #85
Closed
Labels: flag:visibility-3-most (issue will be seen by most users), resolution:fixed (read as "fixed OR done OR implemented OR answered"), severity: 4 (important), status:resolved, type:bug (something isn't working)
In issue #80 and issue #81, the ads-github-cache tool was given the ability to concurrently fetch data from the remote GitHub v3 API, which sped up the update operation considerably. However, we inadvertently broke operations that emit the cached data when running in the "online" cache mode (which basically follows a pull-data-then-print approach). Some of the cached data can be emitted by several processes simultaneously, with the output intermixed (and therefore broken). Such behavior can sometimes be observed by simply piping the JSON data from an endpoint path into the jq tool.

The problem noted in the jq error message is that the JSON structure has been corrupted; the particular error may differ depending on how the output happens to be intermixed on any given run of the program. If we artificially limit the number of background processes to one, we effectively serialize the output and work around the problem.