Build broken because of third-party curl download

Has anyone else noticed that the OpenCPI framework build / install is broken at the moment? (v 2.5.0-beta.1)

./scripts/install-opencpi.sh

The problem is that the command `curl -f -O -L https://www.mjr19.org.uk/sw/inode64.c` fails with an HTTP 403 (Forbidden) status.

Investigation shows that fetching the file with wget succeeds, and that the curl command can be made to work by changing its user-agent string to match the one wget sends.
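The comparison described above can be sketched as follows. This is a minimal illustration, not part of the install script; the exact wget version string is an assumption, and the curl invocations are left commented out so the sketch runs without network access:

```shell
# URL of the prerequisite file from the thread.
URL="https://www.mjr19.org.uk/sw/inode64.c"
# Assumption: a typical wget user-agent string; the real one depends on the installed wget version.
UA="Wget/1.21.3"

# Plain curl (was returning HTTP 403 at the time of this thread):
# curl -f -O -L "$URL"

# The same request with a wget-style user-agent, which succeeded:
CMD="curl -f -O -L -A \"$UA\" $URL"
# curl -f -O -L -A "$UA" "$URL"
echo "$CMD"
```

Passing `-A` makes curl send the given User-Agent header instead of its default `curl/x.y.z`, which is what the server appeared to be filtering on.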

The site www.mjr19.org.uk appears to be Dr Michael Rutter's personal page, and I'm wondering if he has imposed this limitation due to an unexpected rise in curl requests to his site?

Short answer: Yes.

This prerequisite is thought to have been necessary only for running ModelSim on CentOS 7.

An MR addressing this is in development here.

I’d suggest replicating the relevant changes from this MR, namely:

  • Delete the build/prerequisites/inode64 folder.

On most systems, that should be enough to do the initial build.
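The suggested fix can be sketched as below. It is demonstrated here against a throwaway directory tree created with `mktemp` so it runs anywhere; in a real checkout you would simply run `rm -rf build/prerequisites/inode64` from the OpenCPI repo root and then re-run the installer:

```shell
# Simulate an OpenCPI checkout in a temporary directory (illustration only).
repo=$(mktemp -d)
mkdir -p "$repo/build/prerequisites/inode64"

# The actual suggested change: delete the inode64 prerequisite folder.
rm -rf "$repo/build/prerequisites/inode64"

# Confirm the folder is gone before re-running ./scripts/install-opencpi.sh.
if [ -d "$repo/build/prerequisites/inode64" ]; then
  status="present"
else
  status="removed"
fi
echo "$status"   # prints "removed"

rm -rf "$repo"   # clean up the temporary tree
```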

I’m wondering if he has imposed this limitation due to an unexpected rise in curl requests to his site?

That’s possible, although it has been a dependency for a very long time. That said, I think the location it is pulled from has been changed at some point in the past.

Ultimately, one of two things is going to happen (note: I am not a maintainer, but I believe these are the two options):

  • The prerequisite is discovered to no longer be necessary, in which case it will simply be excised from the repo.
  • The prerequisite is necessary, in which case it will be mirrored onto a maintainer-controlled server.

Update:

It seems you might be right about curl requests being blocked, as wget works.

Using curl with the wget user-agent string also works.

Yes, changing the user-agent is a workaround. @waltersdom is correct on the options we are currently considering for a permanent fix.

Below is where you can change it in the prerequisite script to get through the install.

diff --git a/tools/scripts/setup-prerequisite.sh b/tools/scripts/setup-prerequisite.sh
index 2886681af..20eabb786 100644
--- a/tools/scripts/setup-prerequisite.sh
+++ b/tools/scripts/setup-prerequisite.sh
@@ -261,7 +261,7 @@ function download_url {
     local)
       echo Trying to downloading the distribution file locally from:  $2/$3
       echo Download command is: curl -f -O -L $2/$3
-      if curl -f -O -L $2/$file; then
+      if curl -f -O -L -A "Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/81.0" $2/$file; then
         echo Download completed successfully from $2/$3
        if [ -r $3 ] ; then
          [ "$3" != $file ] && mv -f $3 $file
@@ -277,7 +277,7 @@ function download_url {
     internet)
       echo Downloading the distribution file: $file
       echo Download command is: curl -f -O -L $url/$file
-      curl -f -O -L $url/$file && {
+      curl -f -O -L -A "Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/81.0" $url/$file && {
         echo Download complete.  Removing any existing build directories.
         unpack
        return

I asked the website owner (Dr Michael Rutter) and he has confirmed that he hasn’t intentionally blocked the curl requests. He has now sent a support request to his web hosting company to see if it can be sorted.

As of 2023-10-02T15:22:33Z, the issue appears to have been resolved. Dr. Michael Rutter must have come through on his end. Thanks @Icy for reaching out to him.