Versions / Tags for the repositories associated with the publication #2
Correct. Albert Rich had an XX.XX.XX version number for the Rubi engine and wanted to keep it that way. However, we needed another indicator for the version of the Mathematica user interface, which is decoupled from the rules. Regarding your suggestions:

Suggestion 1: I checked several JOSS publications and none of them referenced the version explicitly. However, the links provided on the left side of the final PDF point to (a) the current repository and (b) the archived version, which is exactly the 4.16.0.4 release (I had to give this tag when submitting the paper).

Suggestion 2: Yes, you are correct. I have rebuilt the PDF files for the current 4.16.0 Rubi engine, and there is now a release for the PDF file catalog that is up to date and has the same version number as the Rubi engine.

Suggestion 3: Likewise, I tagged the commit from Aug 03, since this was the version we used to test Rubi 4.16.0. I'd like to postpone the tagging of the other test suites until the next release, since the latest comparison with other CAS was done in early summer and I am not entirely sure whether Nasser used this version. However, the tests are only growing, and their main purpose is to ensure that we have no regressions when Albert Rich implements new Rubi rules. We are currently discussing whether we can find someone to help implement Albert Rich's very detailed integrator test program for other CAS. This is a very tricky business, because Mathematica is strong in symbolic computation, and the question is what to do when the built-in integrator gives a result that differs from the optimal antiderivative. The trivial solutions to this are unfortunately sometimes hard to implement:

Now, try this when you use Mathematica's

Suggestion 4: I hope this is not necessary. Rubi only provides 3 user functions.
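The "trivial solution" alluded to above, deciding whether two antiderivatives agree up to a constant, can be sketched as follows. This is only an illustrative sketch in SymPy (not part of Rubi, which is a Mathematica package), and the function name `equivalent_antiderivatives` is hypothetical; as noted above, the hard part is that `simplify` may fail to reduce the derivative of the difference to zero even when the two forms are equivalent.

```python
import sympy as sp

x = sp.symbols('x')

def equivalent_antiderivatives(f1, f2, x):
    """Check whether f1 and f2 differ only by a constant,
    i.e. whether d/dx (f1 - f2) simplifies to zero.
    Note: simplify() is not a decision procedure, so a False
    result does not prove the two forms are inequivalent."""
    return sp.simplify(sp.diff(f1 - f2, x)) == 0

# Two antiderivatives of sin(x)*cos(x) that look different:
a = sp.sin(x)**2 / 2
b = -sp.cos(x)**2 / 2            # differs from a by the constant 1/2
print(equivalent_antiderivatives(a, b, x))   # True
```

The check sidesteps symbolic equality of the antiderivatives themselves (which can differ by an arbitrary constant) by differentiating the difference instead.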
Addition to Suggestion 4: To give my opinion on suggestion 4 some more weight, a similar case can be found in the fourth-newest JOSS publication, where the manuscript points to a specific release, but the user documentation lives separately on https://prestsoftware.com/ in its most recent version and is cited in the manuscript. So while the archived version points to 0.9.8, the online documentation is at version 0.9.11. I'm not entirely sure, but I believe the main reason for using a tagged archived version is to get a DOI from Zenodo, which requires a GitHub release.
Suggestion 1: I made this suggestion on the basis of the following reviewer checklist item:
The GitHub release is 4.16.0.4, and if you provided the 4.16.0.4 tagged release with the submission, then I agree that this checklist item is satisfied. Thanks!

Suggestion 2: Resolved.

Suggestion 3: My suggestion to version the test suites is partly motivated by what might be a misunderstanding of their role in Rubi. If they are intended only as tests for Rubi itself, then it is probably not as important that they are versioned, except as an internal indicator to the development team of which Rubi version passed which set of tests. But I am under the (possibly mistaken) impression that the test suites are used to compare the performance/functionality of Rubi against that of other CAS. It is this use case that I think necessitates versioning of the test suites. But if this is not intended to be a supported use case of the test suites, then I downgrade my suggestion from "recommended to be resolved" to "think about considering whether it's right for your project," and would consider this part of the issue resolved. :)

If I understand you correctly, you have tagged the test suites used in Nasser's performance report, upon which the performance claims on the website and in the documentation are made. Identifying the test suite used specifically for these claims is sufficient in my view. Moreover, I now see that Nasser's performance report already includes download links to the specific test suites he used, so even a tag in the repository is not strictly necessary in my view (even if I personally think it's a good idea).

Suggestion 4: I am persuaded by your well-reasoned points and retract this suggestion.
@rljacobson Thank you for being generous in understanding my points. As I said earlier, the tests' main purpose is to verify Rubi, and the tagged MathematicaTestSuite shows the state that was used for Rubi 4.16.0.4, the version we want to publish in the manuscript. The system for comparing different CAS is not in its final form, because for many CAS we don't have a verification phase for the returned result (proving that the antiderivative is correct). To quote Nasser:
This is what I meant by being "fair" to other systems. In addition, the grading of the results, which essentially compares the result's complexity to the known optimal antiderivative, is currently only available for Mathematica, Rubi, and Maple. This is why we have recently switched the graphics on the website to show only these three systems instead of the whole bar chart. There, it might appear as if, e.g., FriCAS were better than Maple, but in truth we simply could not grade the results for the other systems. My hope for these comparisons is that we can (a) gain some more manpower, especially people who bring Rubi to other systems, and (b) generate some positive energy in the form of "why can't we do this?". This seems to work when you read messages like this one on the FriCAS forum, or when you hear in one of the recent Wolfram Twitch streams that Wolfram is including some of Rubi's rules in their upcoming version.
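As a rough illustration of the grading idea described above (comparing a result's complexity to that of the known optimal antiderivative), here is a SymPy sketch. The `grade` function and its ratio-based score are my own simplification for illustration, not Rubi's actual grading scheme, and `count_ops` is a crude stand-in for Mathematica's leaf count.

```python
import sympy as sp

x = sp.symbols('x')

def grade(result, optimal):
    """Crude size-based grade: ratio of the result's operation count
    to that of the known optimal antiderivative. 1 means as compact
    as the optimal form; larger values mean a bulkier result.
    (Illustrative only -- Rubi's grading is leaf-count based.)"""
    return sp.count_ops(result) / max(sp.count_ops(optimal), 1)

optimal = sp.sin(x)**2 / 2       # optimal antiderivative of sin(x)*cos(x)
bulky = (1 - sp.cos(2*x)) / 4    # mathematically equivalent, but larger

# The bulkier but equivalent form grades worse than the optimal one:
print(grade(bulky, optimal) > grade(optimal, optimal))   # True
```

A scheme like this only works when a verified optimal antiderivative exists for every test problem, which is exactly why grading is currently limited to the three systems mentioned above.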
This is why I think your test suite is a significant part of your project. Grading and verification problems aside, one could use it to compare computer algebra systems against one another independently of the Rubi rules. Those two examples in your last sentence really demonstrate the advantage of open-source software well, in my view. Both are quite encouraging.
This JOSS submission covers the following content:
The version of Rubi under review is v4.16.0.4. The developer wiki explains that this version number includes both the integration rules (the "Engine") and the Mathematica package (the "Interface") simultaneously:
I recommend the following version-related issues be resolved:
It is best practice for the documentation to indicate which version of the software it applies to. I think this is especially important when the distribution of the documentation is decoupled from the software. (However, the documentation need not be versioned independently from the software it documents; it would be confusing to have "Rubi Documentation v11.2.0 for Rubi v4.16.0.4".)

Retracted.