
Commits : Listings

Analyzed 1 day ago, based on code collected 1 day ago.
Sep 22, 2024 — Sep 22, 2025
Commit Message (Date)
Remove duplicate [build-system] section. Build wheel along with sdist too. (11 days ago)
Currently, we use the build.properties file to store the Pegasus version, and the version is read from this file across Pegasus. This creates a problem for the Python packages, since the file is outside the directory containing the pyproject.toml file, so binary wheels would always show the version as 0.0.0. (11 days ago)
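The packaging problem described above can be addressed by having the Python build read the version out of build.properties at build time and fall back to a placeholder when that file is not shipped (for example, in an sdist). A minimal sketch, assuming a Java-style key = value properties file with a pegasus.version key; the path, key name, and fallback are assumptions, not the change actually made in the repository:

    # Hypothetical sketch: derive the Python package version from the
    # top-level build.properties file, falling back to 0.0.0 when it is
    # not available. Path and property key are assumptions.
    from pathlib import Path

    def read_pegasus_version(properties_file: Path) -> str:
        if not properties_file.is_file():
            return "0.0.0"
        for line in properties_file.read_text().splitlines():
            line = line.strip()
            if line.startswith("pegasus.version") and "=" in line:
                return line.split("=", 1)[1].strip()
        return "0.0.0"

    version = read_pegasus_version(Path(__file__).parent / "../../build.properties")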
For #2124, comment update. (11 days ago)
Merge branch '5.1' of https://github.com/pegasus-isi/pegasus into 5.1. (11 days ago)
For #2124, some internal renaming of a variable. The mkdir case was no longer applicable. (11 days ago)
Remove build for Debian 11. (11 days ago)
Merge branch '5.1'. (11 days ago)
Use an SPDX expression in the license field of setup.py to prevent the warning. (12 days ago)
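The warning text itself is truncated in this listing. For reference, an SPDX license expression in setup.py looks like the sketch below; Pegasus is Apache-2.0 licensed, but the other setup() arguments here are placeholders rather than the project's real metadata:

    # Hypothetical sketch: SPDX expression in the license field of setup.py.
    from setuptools import setup, find_packages

    setup(
        name="example-package",   # placeholder name
        version="0.0.0",
        license="Apache-2.0",     # SPDX expression instead of free-form text
        packages=find_packages(),
    )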
Add build for Debian 13 (Trixie). (12 days ago)
Merge branch '5.1'. (12 days ago)
Run the update-requirements-txt script to update Python dependency versions. (12 days ago)
Add a script to update the requirements.txt file to use the latest package version available for each supported Python version. It excludes PyYAML, GitPython, and cryptography, as we expect those to be installed using dnf/apt, etc. (12 days ago)
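A rough sketch of what such an update script could do, using PyPI's public JSON API to look up the latest release of each pinned package and skipping the packages expected to come from the OS; the real update-requirements-txt script and its per-Python-version handling are not shown here, so the details below are assumptions:

    # Hypothetical sketch: rewrite requirements.txt pins to the latest
    # PyPI release, leaving excluded packages and comments untouched.
    import json
    import urllib.request

    EXCLUDED = {"pyyaml", "gitpython", "cryptography"}  # expected from dnf/apt

    def latest_version(package: str) -> str:
        url = f"https://pypi.org/pypi/{package}/json"
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)["info"]["version"]

    def update_requirements(path: str = "requirements.txt") -> None:
        updated = []
        with open(path) as fh:
            for raw in fh:
                line = raw.strip()
                name = line.split("==")[0].strip()
                if not line or line.startswith("#") or name.lower() in EXCLUDED:
                    updated.append(line)   # keep comments and excluded pins as-is
                else:
                    updated.append(f"{name}=={latest_version(name)}")
        with open(path, "w") as fh:
            fh.write("\n".join(updated) + "\n")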
Fix GitHub URL and replace Bamboo with GitLab CI. (12 days ago)
Merge branch '5.1'. (12 days ago)
Merge branch 'master' of https://github.com/pegasus-isi/pegasus. (12 days ago)
For #2123, multiple transfer_attempts and integrity_verification_attempts multipart records can exist in the same job.out file. While creating the job composite event, if a key exists, we make sure we merge it into the existing data structure. (12 days ago)
For #2123, multiple transfer_attempts and integrity_verification_attempts multipart records can exist in the same job.out file. While creating the job composite event, if a key exists, we make sure we merge it into the existing data structure. (12 days ago)
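The merge behavior described in the two #2123 commits above can be sketched as follows: when the composite event already holds a list for one of these keys, records from later multipart documents are appended rather than replacing it. Function and variable names are illustrative, not the actual monitoring code:

    # Hypothetical sketch: fold repeated multipart records into one
    # composite event by merging list-valued keys instead of overwriting.
    def merge_multipart(composite: dict, record: dict) -> None:
        for key in ("transfer_attempts", "integrity_verification_attempts"):
            if key not in record:
                continue
            if key in composite:
                composite[key].extend(record[key])   # merge into existing entry
            else:
                composite[key] = list(record[key])   # first record seen for key

    composite_event = {}
    for rec in ({"transfer_attempts": [{"attempt": 1}]},
                {"transfer_attempts": [{"attempt": 2}]}):
        merge_multipart(composite_event, rec)
    # composite_event["transfer_attempts"] now holds both attempts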
Reduce required coverage to 20%. (12 days ago)
Reduce required coverage to 20%. (12 days ago)
Merge branch '5.1'. (19 days ago)
For #2120, the Condor classad to set the wf submit dir in the job needs to be set via a function in the Job class to ensure it gets set appropriately for clustered jobs. (19 days ago)
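The #2120 change above is about where the classad gets set rather than what it contains: routing it through a method on the Job class lets a clustered job apply it to its constituent jobs as well. A sketch of that idea in Python (the planner itself is written in Java, and the class and classad names below are illustrative, not the planner's API):

    # Hypothetical sketch: set the classad via a Job method so a clustered
    # job can propagate it to its constituents. All names are illustrative.
    class Job:
        def __init__(self):
            self.classads = {}

        def set_wf_submit_dir_classad(self, submit_dir: str) -> None:
            self.classads["+pegasus_wf_submit_dir"] = f'"{submit_dir}"'

    class ClusteredJob(Job):
        def __init__(self, constituents):
            super().__init__()
            self.constituents = constituents

        def set_wf_submit_dir_classad(self, submit_dir: str) -> None:
            super().set_wf_submit_dir_classad(submit_dir)
            for job in self.constituents:   # constituent jobs get it too
                job.set_wf_submit_dir_classad(submit_dir)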
For #2120, some refactoring of the names. (19 days ago)
Merge branch '5.1'. (20 days ago)
For #2122, ported changes to the new analyzer module too. (20 days ago)
For #2122, add the initial dir to all files to be copied as long as they don't start with / or $(wf_submit_dir) in the submit file of the job being debugged. (20 days ago)
For #2122, substitute $(wf_submit_dir) with ${wf_submit_dir} in the generated debug shell script for a job. (20 days ago)
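The two #2122 behaviors above can be sketched as a pair of small helpers: relative input paths get the job's initial dir prepended unless they are absolute or already rooted at $(wf_submit_dir), and the Condor macro is rewritten to shell variable syntax when the debug script is generated. The function names are illustrative, not the actual debugging code:

    # Hypothetical sketch of the #2122 path handling and macro rewrite.
    import os

    def resolve_input_path(path: str, initial_dir: str) -> str:
        # Leave absolute and submit-dir-rooted paths alone.
        if path.startswith("/") or path.startswith("$(wf_submit_dir)"):
            return path
        return os.path.join(initial_dir, path)

    def to_shell_form(line: str) -> str:
        # $(wf_submit_dir) is Condor macro syntax; a shell script needs
        # ${wf_submit_dir} instead.
        return line.replace("$(wf_submit_dir)", "${wf_submit_dir}")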
Merge branch '5.1'. (21 days ago)
Fixes #2121. We no longer throw an error when planning a wf in condorio mode against a bosco site, where the pegasus style is set to ssh. (21 days ago)
For #2120, only set the wf_submit_dir classad in the condor code generator. (about 1 month ago)
For #2120, we know what files are transferred from the workflow submit dir via condorio, so we are specific about it. Substitution now does not happen for all files added via condorio; only the ones we know are from the wf submit directory. (about 1 month ago)