What version of Vivado is supported for OpenCPI 2.4.7?

The installation documentation still mentions 2019.2, but I am assuming the documentation has not been updated.

Hi,

Note: v2.4.7 is still in pre-release; if you use it, you do so at your own risk.

Xilinx Vivado 2019.2 remains the recommended Vivado version as of 31st January 2024. Depending on what you are doing, newer Vivado versions may or may not work.

There are three main barriers to updating the Vivado version:

  • Each Vivado update generally also includes a kernel version update.
    • This can cause issues with the OpenCPI kernel driver.
  • AMD Xilinx arbitrarily move or change the directory structure of their tools and of their tools' outputs.
    • This has a habit of breaking various build scripts.
  • AMD Xilinx has a habit of introducing segfaults somewhat at random in any of their tools.
    • This is particularly true of xsim, and even more so if you are simulating VHDL or cosimulating.
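Since the first barrier is tied to the kernel version, a quick sanity check before trying a newer Vivado is to note your running kernel and which Vivado versions you actually have installed. This is only a minimal sketch; the /tools/Xilinx/Vivado path is a common default install location, not something from this thread, so substitute your own Xilinx root:

```shell
# Print the running kernel version; the OpenCPI kernel driver is sensitive to this.
uname -r

# List installed Vivado versions. /tools/Xilinx/Vivado is an assumed default --
# replace it with your own Xilinx install root if you installed elsewhere.
ls /tools/Xilinx/Vivado 2>/dev/null || echo "No Vivado found at the assumed path"
```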

Vivado 2021.1 and later currently do not work on hardware due to issues with the kernel driver (AFAIK). There are various issues tracking the inclusion of 2021.1:

As of this week, there is an open MR to add 2021.1 support:

I believe 2020.1 and 2020.2 will work for xsim and hardware, but you’ll need to create your own rccplatform and continue at your own risk. These versions are not explicitly supported, so you run the risk of hitting bugs that the CI has not encountered before. You will also be trusting my memory that these versions come before the kernel version that breaks 2021.1.


I’d also point you towards this issue which provides some guidance relating to simulating using xsim on the main supported OSes:

If you want to somewhat quickly verify that a particular Vivado version can be used for xsim, you could use the tool I made available here:

To use this on Ubuntu 20.04 and check whether the testbias assembly compiles for xsim in v2.4.7 using Vivado 20XX.X:

sudo apt-get -y install git make podman
git clone https://gitlab.com/d9t2/ocpicontainer.git
cd ocpicontainer
make build-ubuntu20_04-opencpi-release-2.4.7-xilinx-20XX.X-testbias \
  from_xilinx_dir=/path/to/your/xilinx/root \
  opencpi_branches=release-2.4.7

Note: I’m suggesting Ubuntu 20.04 because Ubuntu 22.04 is currently broken as of this week, due to Canonical changing the gcc version used for the kernel. There is a fix in testing, but ocpicontainer does not integrate it at this time.

If you use this tool and encounter difficulties (noting that it is a year old at this point), please reply here or open an issue. In particular, you may have to modify some of the variables at the top of the Makefile, although I have tried to pre-empt those edits in the command given above.

If it completes without error (about an hour's wait), then testbias built successfully. You should then open the container and try to run it:

make open-ubuntu22_04-opencpi-release-2.4.7-xilinx-20XX.X-testbias

Note: This tool does not verify that synthesis or anything on hardware works, just building testbias for xsim.
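If you are instead trying an xsim build outside the container, a common stumbling block is simply that xsim is not on your PATH. Here is a minimal sketch of a check; the settings64.sh location follows the usual Vivado convention and the 20XX.X placeholder mirrors the commands above, so adjust both to your install:

```shell
# Check whether xsim is reachable; if not, Vivado's environment script
# has presumably not been sourced in this shell.
if command -v xsim >/dev/null 2>&1; then
  xsim --version
else
  echo "xsim not on PATH; try: source /tools/Xilinx/Vivado/20XX.X/settings64.sh"
fi
```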

Personally, I have recently used Vivado 2022.1 xsim on Arch Linux (my custom rccplatform), and it seemed to work just fine.

Thank you for the detailed response. Is the main issue a lack of companies contributing effort into porting OpenCPI to the latest tools and operating systems?

That’s a good question.

I’d say that there is obvious risk in using any open source project outside the defined, current lane that it is in. So, from the start, there is probably a lack of incentive to investigate integrating newer tools, unless delays are acceptable.

Then there is the difficulty of making the updates, which varies depending on which area of the codebase is being addressed (it can require someone with significant experience with the framework), and sometimes these updates cause problems that are really hard to diagnose. The kernel driver is a great example of that, and something that demands extensive testing on hardware.

Finally, there is the double-edged sword of willingness to contribute patches to the framework, and of the maintainers having the time to review, test, and merge the changes. The latter has improved over time, with more community contributions getting merged in recent months (I had two PRs merged over the autumn), but it has been a serious bottleneck in the past.