
On 12/25/2012 3:08 PM, Dave Abrahams wrote:
on Tue Dec 25 2012, Rene Rivera <grafikrobot-AT-gmail.com> wrote:
On 12/17/2012 12:25 PM, Dave Abrahams wrote:
on Sun Dec 16 2012, Rene Rivera <grafikrobot-AT-gmail.com> wrote:
On Wed, Dec 12, 2012 at 9:49 AM, Beman Dawes <bdawes@acm.org> wrote:
On Tue, Dec 11, 2012 at 10:52 AM, Rene Rivera <grafikrobot@gmail.com> wrote:
Hm.. That's barely a step :-\ ..And there's no need to branch. The tools already support multiple transport methods, so we can just add another. Which brings me to one of the transport methods regression testing currently supports: downloading the current trunk/release as a ZIP archive. I was hoping to use the GitHub facility that exists for downloading ZIPs of the repos. But unfortunately I couldn't make it attach the contents of the indirectly referenced library subrepos.
That's right, unfortunately. However, we can get the exact URLs of the ZIP files from the GitHub API. I've recently done some scripting with that, e.g. https://github.com/ryppl/ryppl/blob/develop/scripts/github2bitbucket.py#L40
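For what it's worth, once you know an owner, repo, and ref, the per-repo ZIP URL follows a predictable pattern, so a script can build it without parsing API responses at all (a minimal sketch; the `owner`/`ref` values are illustrative, and the API equivalent would be `/repos/{owner}/{repo}/zipball/{ref}`):

```python
def zipball_url(owner, repo, ref="master"):
    # GitHub serves a ZIP snapshot of any branch, tag, or commit
    # at a fixed archive URL; no API call or auth token required.
    return "https://github.com/%s/%s/archive/%s.zip" % (owner, repo, ref)

# e.g. zipball_url("boostorg", "boost")
#   -> "https://github.com/boostorg/boost/archive/master.zip"
```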
In fact, I think someone has coded up what's needed to make a monolithic zip here: https://github.com/quarnster/sublime_package_control/commit/9fe2fc2cad9bd2e7...
After looking at both of those, I see no point in using the GitHub API (or the additional structured data from Sublime -- though I'm not totally sure where the submodule info comes from in that case) as it provides no more information than one can get from just parsing the ".gitmodules" file.
I'm pretty sure that's not correct. The .gitmodules file doesn't contain information about which commit to check out for each submodule.
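(For completeness: the pinned commits live in the superproject's tree as "gitlink" entries, which `git ls-tree <ref>` reports with mode 160000 and object type `commit`. A sketch of extracting them, assuming that output format:)

```python
def submodule_shas(ls_tree_output):
    # Submodule lines in `git ls-tree <ref>` look like:
    #   160000 commit <sha>\t<path>
    # while ordinary files show type "blob" and directories "tree".
    shas = {}
    for line in ls_tree_output.splitlines():
        meta, _, path = line.partition("\t")
        mode, objtype, sha = meta.split()
        if objtype == "commit":  # gitlink = pinned submodule commit
            shas[path] = sha
    return shas
```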
Right, it doesn't. But your ryppl code doesn't handle that either, since it fetches the repos individually from the non-version-specific master branches (AFAICT). And the Sublime code uses its own metadata files, ".sublime-package" and "package-metadata.json", to determine what to get, although I can't tell whether those contain specific version info. But since it also looks like it works with cloned repos, perhaps it doesn't need to worry about that.
Hence the complexity of supporting testing with ZIPs is now an order of magnitude larger, as it means fetching more than a hundred individual repos :-(
Which now seems to be the only choice. On the tester side I will have to get the boost-master archive, parse out the ".gitmodules" file, and get each subrepo archive individually -- which increases the likelihood of failure considerably.
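The parsing step, at least, is straightforward: ".gitmodules" is an INI-like file, so a tolerant line scan is enough (a sketch, assuming the tab-indented layout git normally writes; the submodule names and paths are illustrative):

```python
import re

def parse_gitmodules(text):
    # .gitmodules consists of [submodule "name"] sections with
    # indented "path = ..." and "url = ..." keys. A simple scan
    # avoids ConfigParser's trouble with tab-indented keys.
    modules, current = {}, None
    for line in text.splitlines():
        line = line.strip()
        m = re.match(r'\[submodule "(.+)"\]', line)
        if m:
            current = m.group(1)
            modules[current] = {}
        elif current and "=" in line:
            key, _, value = line.partition("=")
            modules[current][key.strip()] = value.strip()
    return modules
```

Each entry's `url` (often relative to the superproject's URL) can then be turned into a per-repo archive URL and fetched in a loop.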
I'm not sure. Isn't it true that shorter transfers are more likely to succeed than longer ones?
Perhaps, if one happens to have an unreliable internet connection. But I would expect testers to have reliable connections, so that's a minor unreliability.. The more likely problem is code bugs in the testing script ;-)
--
-- Grafik - Don't Assume Anything
-- Redshift Software, Inc. - http://redshift-software.com
-- rrivera/acm.org (msn) - grafik/redshift-software.com
-- 102708583/icq - grafikrobot/aim,yahoo,skype,efnet,gmail