Add export & import for repeating instruments and events #210
Conversation
|
I don't think the failing checks are related to my changes, but rather to some server token permissions not granted for tests originating from a fork. @pwildenhain Any way I can fix this or is this an issue of the GH Actions configuration? |
|
You are correct, it's the fact that your account can't access repo secrets, and it's happening with the doctests. What's confusing to me is that I thought I had configured these to skip on forks, so I must've messed up the logic. If you check out .github/workflows/ci.yml, specifically at line 40, you'll see the skip logic I'm talking about. Looks like I'll have to tweak that to move this forward. Does everything work when you run pytest locally? |
|
Locally using pytest I get some pylint issues, not sure if these will also pop up in the CI. In addition, the Regarding the conditional run of the doctests, maybe you have to use a different syntax ( |
|
Yea, looks like we might need to use github.ref or github.actor. As a test, can you change that line to only run if github.actor == 'pwildenhain'? That's a little hard-coded for my usual taste, but at least we can get this PR to pass. Down the road I can change this to only run if the ref starts with redcap-tools or something like that (aka don't run if the PR originated from a fork) |
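For reference, a hypothetical sketch of what that gate could look like in .github/workflows/ci.yml (the job name, step layout, and test command are assumptions for illustration, not the actual file):

```yaml
jobs:
  doctests:
    # Hard-coded gate suggested above; down the road this could become
    # something like startsWith(github.ref, 'refs/heads/redcap-tools')
    # so that PRs originating from forks are skipped automatically
    if: github.actor == 'pwildenhain'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: pytest --doctest-plus
```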
|
It seems the token permission issue also occurs with the main tests. I had the same issue here and solved it very indirectly via the |
|
Hmmmm, it's appropriately skipping the integration tests, but it looks like it's still trying to build the doctest fixture, and that's where it's freaking out. Really appreciate all your contributions, I'll hopefully be able to provide a lot of fixes once I'm in the office today, probably bleeding into tomorrow as well |
Previously the doctests would run by default with the `doctest-plus` option enabled. By removing that option, it will need to be added whenever someone wants to run the doctests locally. But this makes it actually possible for the CI to skip the doctests when running from a fork
Since they are under a separate section in the API playground, I'd prefer to keep them separated here, also because they can export events as well as instruments.
|
See JuliaSprenger#1 for some proposed changes to this PR that I think will allow the CI to run 🤞🏻 |
♻️ PR modifications
|
I thought I fixed the |
Codecov Report
@@ Coverage Diff @@
## master #210 +/- ##
===========================================
- Coverage 100.00% 98.89% -1.11%
===========================================
Files 16 17 +1
Lines 531 541 +10
===========================================
+ Hits 531 535 +4
- Misses 0 6 +6
|
I ended up getting rid of 🎉 So awesome that the CI passes now 🙌🏻 Would you like to try your hand at adding some unit tests? If so, I can walk you through the architecture -- if not, I can add them myself. |
|
The tests fail with a |
|
Yup, the tests are a bit of a maze! I'll do a write-up here (maybe some version of this should live somewhere?); let me know if anything is unclear or if there is somewhere you'd prefer that I step in to contribute. Something I can definitely do in the meantime is take the modifications you've made to the tests here and add them as integration tests rather than unit tests. I'll submit a PR for that shortly (edit: JuliaSprenger#2). Hold off on pushing anything until you're able to add that into this PR.

Testing Architecture

The tests are broken up into unit tests and integration tests. The integration tests connect to a real REDCap server over at redcapdemo.vanderbilt.edu. I have a super-user token which can create new projects to be used for testing. The xml files used in creating these test projects on the real server are under These are really nice to ensure that the package works on the real thing. But I can't share this super-user token with others, so I need to rely on the unit tests for others to use to test package functionality.

Unit tests

The unit tests use the Requests are then routed to this mock server, which chooses the response data based off of a handler function. With the You'll need to add a modify the For example the For the purposes of these endpoints, just testing the |
The current setup requires every new endpoint that could be returned as a data frame to be added here. That's bad form; let's just make the default an empty dict
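The default-to-empty-dict idea might look something like the sketch below (the mapping name, keys, and kwargs are hypothetical, not the actual PyCap code):

```python
# Instead of enumerating every endpoint that can become a data frame,
# fall back to an empty dict of kwargs for anything not listed.
# Names and values here are illustrative.
DF_KWARGS_BY_ENDPOINT = {
    "record": {"index_col": "record_id"},
    "metadata": {"index_col": "field_name"},
}


def df_kwargs_for(content: str) -> dict:
    # Unknown endpoints get no special data-frame kwargs
    return DF_KWARGS_BY_ENDPOINT.get(content, {})
```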
Similar to the prior change, this was hard-coded for record and metadata requests. But as this PR is demonstrating, there are a lot of other endpoints out there that allow imports, and we certainly don't want to have to include each one in here. All that really seems to matter is whether or not there _is_ a named index. If there is, we want to keep it when we export. If there's not, then we don't need it. As you can see from the test, importing from a dataframe will likely still require some finagling on the part of the user to ensure all the formats are correct.
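The "keep the index only if it's named" rule can be sketched with pandas as below; this mirrors the idea described above, not PyCap's exact implementation, and the function name is made up:

```python
# A named index (e.g. "record_id") carries real data, so keep it when
# serializing for import; an unnamed RangeIndex is just positional and
# can be dropped. Illustrative sketch, not PyCap's actual code.
import pandas as pd


def df_to_import_payload(df: pd.DataFrame) -> str:
    keep_index = df.index.name is not None
    return df.to_csv(index=keep_index)


records = pd.DataFrame({"age": [34]}, index=pd.Index(["1"], name="record_id"))
events = pd.DataFrame({"form_name": ["weekly_visit"]})  # unnamed index

print(df_to_import_payload(records))  # record_id column is kept
print(df_to_import_payload(events))   # no index column in the output
```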
Add doctests and integration tests for repeating forms
|
Hi @pwildenhain I copied some of the repeatingFormsEvents tests to be unit tests and added the corresponding lines for the export in the callback_utils. However, I am not sure how these utils distinguish between export and import. For this reason the |
|
You're so close here, that I am just gonna close the gap (see JuliaSprenger#3) and I'll add a more in-depth explanation below
This points to a problem with the REDCap API -- everything is a How REDCap distinguishes instead is the presence of the

PyCap/tests/unit/callback_utils.py Lines 135 to 153 in 8a7d169

Basically we just check the for the It's kinda neat, gives a little bit of insight into how a real REDCap server needs to handle these types of requests -- but definitely not intuitive 😂 |
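The exact request field being checked is elided in the comment above, so the sketch below is a guess at the kind of check described: assuming imports are recognized by the presence of a "data" payload in the request body, which exports don't carry. Names are illustrative.

```python
# Hedged sketch: tell an import request apart from an export request by
# whether the form-encoded body carries a "data" field. The actual field
# checked in callback_utils.py is elided above; "data" is an assumption.
from urllib.parse import parse_qs


def is_import_request(body: str) -> bool:
    fields = parse_qs(body)
    return "data" in fields


export_body = "content=repeatingFormsEvents&format=json"
import_body = "content=repeatingFormsEvents&format=json&data=%5B%5D"
```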
Fix unit test
|
I am not sure in which way the uncovered lines are related to the changes. @pwildenhain Any suggestions on what to re-add to cover those? |
|
I think I understand what's happening. Since the doctests don't get run with the regular unit tests, the I think all I need to do is add But that's not your problem, so I'll merge this and take care of that separately. Beautiful work here ✨ really appreciate you contributing, especially with all the architecture changes since the last time you contributed. Hope to see you back again 😉 |
|
Thanks for merging and all of the advice you gave in this PR. I learned quite a bit :) |
With the new test framework I am not sure how / where to integrate a dedicated test for these functions. Any advice on this?