WEBVTT

00:00.000 --> 00:11.200
Our next speaker is Christophe Alexandre.

00:11.200 --> 00:16.120
He is one of the authors of Naja EDA.

00:16.120 --> 00:22.440
This is going to be an interesting new take on EDA using Python.

00:22.440 --> 00:25.840
So please give a warm welcome to Christophe.

00:25.840 --> 00:35.400
Does everyone hear me? Yeah. So, hi everybody, I'm Christophe.

00:35.400 --> 00:44.680
So the EDA I'm going to talk about here is not like KiCad, the PCB EDA; it's digital design

00:44.680 --> 00:48.560
EDA, the designing-chips category.

00:48.560 --> 00:55.480
So let me introduce what we are doing here.

00:55.480 --> 01:02.640
So the Naja EDA package itself is quite young; we just launched it in December 2024.

01:02.640 --> 01:10.840
And what we are chasing here is simplification in the development of EDA tools.

01:10.840 --> 01:14.000
The first thing is installability.

01:14.000 --> 01:21.280
So getting the package is just a pip install najaeda.

01:21.280 --> 01:33.440
But also simplification in the API: netlist browsing, how to write algorithms to be able

01:33.440 --> 01:39.680
to manipulate netlists, so it can be P&R algorithms, physical implementation algorithms.

01:39.680 --> 01:47.240
But also data mining, which is something that we are currently working on: how

01:47.240 --> 01:56.240
to interpret the different views that are used in EDA, the netlist view, timing view, physical

01:56.240 --> 02:03.680
view, and how to collect all this information and then provide it to data

02:03.680 --> 02:07.520
mining or AI engines.

02:07.520 --> 02:12.960
So where are we in the flow currently? For those of you who know the open-source

02:12.960 --> 02:19.320
EDA world, we are at the netlist level, so we take synthesized netlists.

02:19.320 --> 02:27.760
So Yosys outputs, for those of you who know the tool, and we currently sit between

02:27.760 --> 02:31.320
Yosys and OpenROAD.

02:31.320 --> 02:41.080
We put a large accent on scalability and performance. We are coming initially

02:41.080 --> 02:47.600
from a world where we worked on very huge designs, so we want to be able to manage

02:47.600 --> 02:53.200
the biggest designs, if they are available.

02:53.200 --> 03:03.760
And we also emphasize high fidelity of the data, meaning that when we take data as an input,

03:03.760 --> 03:11.120
it can be Verilog or whatever, we try to preserve everything that is in this data,

03:11.120 --> 03:16.960
meaning that we don't take the data just as an input to algorithms and basically destroy

03:16.960 --> 03:23.160
everything that those algorithms won't need; we really try to keep everything that is there.

03:23.160 --> 03:31.880
One thing that is already easy with this tool today: if you

03:31.880 --> 03:40.800
want to edit netlists, to make small or big changes, it's very convenient

03:40.800 --> 03:45.120
for doing this.

03:45.120 --> 03:56.040
So Naja EDA is the tip of the Naja iceberg. Naja EDA launched in December, but there

03:56.040 --> 04:05.200
are already quite a few things available in the Naja ecosystem, which is mostly a C++ ecosystem.

04:05.200 --> 04:10.360
For instance, there is this tool, I won't get too much into details, that is used

04:10.360 --> 04:17.880
for netlists, mostly big-netlist optimization, in particular in the context where those

04:17.880 --> 04:24.240
netlists have been synthesized hierarchically, because there are pending optimizations in that case.

04:24.240 --> 04:31.760
So this example here is the MegaBoom design, which is an open-source RISC-V design that is available;

04:31.760 --> 04:36.160
it is around 4 million gates after synthesis.

04:36.200 --> 04:42.960
And by applying dead logic elimination or constant propagation on this netlist, we are able

04:42.960 --> 04:50.240
to remove 20% of the design in 15 minutes. This is just to explain that we are

04:50.240 --> 04:56.840
really working on doing that kind of thing efficiently.
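The constant propagation mentioned here can be sketched in a few lines of plain Python. This is a toy illustration under assumed data structures, not Naja's implementation: the gate kinds, net names, and the `propagate_constants` helper are all invented for the example, and a real pass would also handle controlling values (for instance, an AND gate with one constant-0 input folds regardless of its other inputs).

```python
# Toy constant propagation over a flat gate list (illustrative only,
# not the Naja EDA implementation; all names here are made up).

GATE_EVAL = {
    "AND": lambda ins: all(ins),
    "OR":  lambda ins: any(ins),
    "NOT": lambda ins: not ins[0],
}

def propagate_constants(gates, constants):
    """gates: list of (kind, input_nets, output_net) tuples.
    constants: {net_name: 0 or 1} of known constant nets.
    Evaluate every gate whose inputs are all constants, record its
    output as a new constant, and repeat until a fixed point."""
    consts = dict(constants)
    remaining = list(gates)
    changed = True
    while changed:
        changed = False
        still_here = []
        for kind, ins, out in remaining:
            if all(net in consts for net in ins):
                consts[out] = int(GATE_EVAL[kind]([consts[n] for n in ins]))
                changed = True   # this gate folds away into a constant
            else:
                still_here.append((kind, ins, out))
        remaining = still_here
    return remaining, consts

# 'a' and 'zero' are constants, so the AND gate folds to n1 = 0;
# the OR gate survives because 'b' is unknown.
gates = [("AND", ["a", "zero"], "n1"), ("OR", ["n1", "b"], "y")]
remaining, consts = propagate_constants(gates, {"a": 1, "zero": 0})
```

The fixed-point loop matters: folding one gate can make another gate's inputs fully constant on the next pass.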

04:56.840 --> 05:04.880
But Naja EDA is not, let's say, just a Python export of the C++ classes, like a simple

05:04.880 --> 05:10.160
automatic wrapping from C++ to Python.

05:10.160 --> 05:19.600
It's really a layer where we try to simplify the complexity of the underlying C++ ecosystem,

05:19.600 --> 05:26.200
and to take advantage of going to Python, to take advantage

05:26.200 --> 05:34.320
of this sort of translation or export, to also simplify some

05:34.320 --> 05:35.320
concepts.

05:35.320 --> 05:41.320
So for instance, in C++ we can have multiple views of the design, like physical, timing,

05:41.320 --> 05:42.320
etc.

05:42.320 --> 05:46.320
In the Python layer, we merge them.

05:46.320 --> 05:52.800
Also, when you have a hierarchical netlist, usually you have the instantiation context, and

05:52.800 --> 05:57.840
then you have an instance in its context, in its hierarchical context.

05:57.840 --> 06:02.840
Here we have only one instance object, which is basically what could be called an

06:02.840 --> 06:07.840
instance in its hierarchical context.

06:07.840 --> 06:15.800
So there is a simplification of everything related to hierarchical traversal management.

06:15.800 --> 06:21.080
One thing: if you modify an instance (an instance here is an instance in its context),

06:21.080 --> 06:29.960
then everything related to uniquification of parents, depending on whether this instance

06:29.960 --> 06:35.320
is a unique instance or is instantiated multiple times, all this uniquification

06:35.320 --> 06:38.720
will be managed automatically.

06:38.720 --> 06:48.320
Same for terms and nets: top terms, instance terms, all this is merged. Bus management

06:48.320 --> 06:55.000
versus single bits is also simplified a lot, so you can connect whether there are multiple

06:55.000 --> 06:59.920
bits or a single bit; this is simplified.

06:59.920 --> 07:05.520
Then, later on, I will talk about the equipotential, which is basically a

07:05.520 --> 07:13.400
hierarchical net seen transparently across the design. So now, some examples: if you basically want

07:13.400 --> 07:24.640
to load a design into this, it's just a few lines. We try to support, let's say,

07:24.640 --> 07:34.000
the ASIC world, which is more the Liberty-and-Verilog approach, and the FPGA world,

07:34.000 --> 07:43.600
which usually does not have Liberty, so we have an internal library of primitives that

07:43.600 --> 07:51.640
we support, and then you can dump the Verilog. In that case, some people that

07:51.640 --> 07:56.520
have tried this are surprised, because they think it's just some kind of textual tool,

07:56.520 --> 08:03.880
because if you don't do anything, the output will look a lot like the input,

08:03.880 --> 08:09.640
and some people are also surprised that we keep the order, we keep, I don't know,

08:09.640 --> 08:13.840
the pragmas in the file; we can really keep everything.

08:13.840 --> 08:20.640
So now, if you want to print the netlist, in just a few lines you can write that kind

08:20.640 --> 08:26.560
of code: it will just print all the hierarchy, all the instances in the design,

08:26.560 --> 08:33.960
each in its hierarchical context. Of course you can print whatever you want; this is just an example.
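Printing every instance in its hierarchical context can be sketched like this. It is a toy model under assumed names (the `Instance` class and its fields are invented for the example, not Naja EDA's actual API):

```python
# Toy sketch of printing every instance in its hierarchical context
# (illustrative only; Instance and its fields are invented, not Naja's API).

class Instance:
    def __init__(self, name, model, children=()):
        self.name = name                # instance name inside its parent
        self.model = model              # the model (module) it instantiates
        self.children = list(children)  # child instances of that model

def iter_instances(inst, path=""):
    """Yield (hierarchical_path, model) for every instance, depth first."""
    here = f"{path}/{inst.name}" if path else inst.name
    yield here, inst.model
    for child in inst.children:
        yield from iter_instances(child, here)

top = Instance("top", "Top", [
    Instance("u0", "Adder", [Instance("fa0", "FullAdder")]),
    Instance("u1", "Adder", [Instance("fa0", "FullAdder")]),
])
for path, model in iter_instances(top):
    print(f"{path} ({model})")
```

Note that the two `fa0` instances print separately, once per hierarchical context, even though they come from the same `Adder` model.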

08:33.960 --> 08:40.960
We also support visitors. Currently we have an instance visitor, which allows you

08:40.960 --> 08:48.120
to implement callbacks: you can visit instances with filters and apply

08:48.120 --> 08:56.080
some methods, some functions, when you find those instances, with stop conditions if you want.
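The filter, callback, and stop-condition idea can be sketched as follows. This is a toy dict-based tree, not the real visitor API; all names are invented for the example:

```python
# Toy visitor over a dict-based instance tree (illustrative only; the
# real Naja EDA visitor API differs, this just shows filter/callback/stop).

def visit(node, callback, filt=lambda n: True,
          stop=lambda n, depth: False, depth=0):
    """Depth-first walk: run `callback` on nodes passing `filt`,
    and do not descend below nodes where `stop` returns True."""
    if filt(node):
        callback(node)
    if stop(node, depth):
        return
    for child in node["children"]:
        visit(child, callback, filt, stop, depth + 1)

design = {"name": "top", "model": "Top", "children": [
    {"name": "u0", "model": "Adder", "children": [
        {"name": "fa0", "model": "FullAdder", "children": []}]},
    {"name": "mem", "model": "RAM", "children": []},
]}

visited = []
# Collect every instance except RAMs, never descending below depth 2:
visit(design, lambda n: visited.append(n["name"]),
      filt=lambda n: n["model"] != "RAM",
      stop=lambda n, depth: depth >= 2)
```

The filter decides whether the callback fires, while the stop condition prunes the traversal itself; the two are independent.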

08:56.080 --> 09:05.840
Okay, so this is another example: you can extract very easily, in a few

09:05.840 --> 09:14.240
lines, the netlist statistics. We are also starting to support more complex statistics

09:14.320 --> 09:24.880
and connections to Python packages like pandas, but also PyTorch, so as to have more

09:24.880 --> 09:30.680
complex outputs. So this is just an example of the kind of thing you can get into

09:30.680 --> 09:38.680
pandas with this report.
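Netlist statistics of this kind boil down to flat records gathered from the hierarchy, which is exactly the shape pandas likes. A toy sketch with an assumed data layout (not the real report format):

```python
# Toy sketch: per-model instance counts over a hierarchical netlist,
# the kind of flat record you could then feed to pandas or PyTorch
# (illustrative only; the data layout is invented for the example).
from collections import Counter

def count_models(node, counts=None):
    """Recursively count how many instances of each model exist."""
    counts = Counter() if counts is None else counts
    counts[node["model"]] += 1
    for child in node["children"]:
        count_models(child, counts)
    return counts

design = {"model": "Top", "children": [
    {"model": "AND2", "children": []},
    {"model": "AND2", "children": []},
    {"model": "DFF", "children": []},
]}
stats = count_models(design)
# From here a DataFrame is one call away, e.g.:
# pd.DataFrame(stats.items(), columns=["model", "count"])
```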

09:38.680 --> 09:44.800
And this one, I know it's small, but this is dead logic elimination implemented in a few lines: this

09:44.800 --> 09:50.720
will remove all the dead logic in your design. It's just to show that you can implement

09:50.720 --> 09:59.480
that kind of algorithm and sketch it very easily.
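The core of such a pass fits in a few lines of plain Python. This is a toy version under an assumed netlist representation, not the code shown on the slide:

```python
# Toy dead logic elimination (illustrative only, not Naja's code):
# repeatedly drop gates whose output drives neither a primary output
# nor any remaining gate's input.

def dead_logic_elimination(gates, primary_outputs):
    """gates: {gate_name: {"ins": [net, ...], "out": net}}.
    primary_outputs: set of nets that must be preserved."""
    gates = dict(gates)
    while True:
        used = set(primary_outputs)
        for g in gates.values():
            used.update(g["ins"])
        dead = [name for name, g in gates.items() if g["out"] not in used]
        if not dead:
            return gates
        for name in dead:
            del gates[name]   # removing a gate may expose more dead logic

gates = {
    "g1": {"ins": ["a", "b"], "out": "n1"},
    "g2": {"ins": ["n1"], "out": "y"},
    "g3": {"ins": ["n2"], "out": "n3"},  # n3 drives nothing: dead
    "g4": {"ins": ["a"], "out": "n2"},   # only feeds g3, dies next pass
}
live = dead_logic_elimination(gates, {"y"})
```

The loop runs to a fixed point because removing one dead gate (here `g3`) can leave its fan-in cone (`g4`) dead in turn.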

09:59.480 --> 10:07.760
Okay, so now to explain the difference between a net and an equipotential. It's a very simple example:

10:07.840 --> 10:13.760
here you have three instances of the same model, and then you have connectivity across

10:13.760 --> 10:22.600
the hierarchy. So if you look at the nets, they are all in their own context: here you have seven

10:22.600 --> 10:33.280
nets. But now, if you want to look at the connectivity flat, without flattening, you will

10:33.360 --> 10:42.200
have the equipotential object, which allows you to do this without any transformation inside;

10:42.200 --> 10:51.560
it's just some kind of visitor that virtually flattens and visits the hierarchy without

10:51.560 --> 10:58.520
actually flattening it, so you can see the connectivity across the design like this and go up and down.
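The up-and-down traversal can be modeled as a graph walk over hierarchy-boundary connections. This is a toy abstraction (the link representation and all net names are invented for the example; the real equipotential works directly on the in-memory netlist):

```python
# Toy model of an equipotential (illustrative only): starting from one
# hierarchical net, follow port connections up and down the hierarchy to
# collect every net that is electrically the same signal, without flattening.

def equipotential(start, boundary_links):
    """boundary_links: (net_a, net_b) pairs, each crossing one hierarchy
    boundary; nets are path-qualified names. Returns the set of nets
    reachable from `start` through those boundaries."""
    adj = {}
    for a, b in boundary_links:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    seen, stack = {start}, [start]
    while stack:
        for neighbor in adj.get(stack.pop(), ()):
            if neighbor not in seen:
                seen.add(neighbor)
                stack.append(neighbor)
    return seen

# A top-level net n1 ties instance u0's output to instance u1's input,
# which continues on an internal net inside u1:
links = [("top/n1", "top/u0/out"),
         ("top/n1", "top/u1/in"),
         ("top/u1/in", "top/u1/inner")]
eq = equipotential("top/u0/out", links)
```

Each net keeps its own hierarchical identity; the equipotential is just the connected component they form across boundaries.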

10:58.520 --> 11:08.120
Okay, so what are the next steps now? We are working on adding a physical

11:08.120 --> 11:18.480
view, to be able to make the connection between the logical view and the physical implementation,

11:18.480 --> 11:25.240
so we also need to add LEF support; we are working on this. We are also working

11:25.320 --> 11:35.240
on the fact that this data comes from different sources; sometimes those sources do not have the same

11:35.240 --> 11:43.320
hierarchy: usually the physical information is completely flat, or has a different hierarchy,

11:43.320 --> 11:49.400
a physical hierarchy, so we are also working on making this connection automatically when we load the data,

11:50.400 --> 11:57.800
and the idea, our current focus, is: now that we have all this data

11:57.800 --> 12:08.040
internally, really structured without any loss, to be able to transfer it to those AI

12:08.040 --> 12:15.040
engines. Yeah, thank you.

12:15.040 --> 12:22.760
So any questions?

12:22.760 --> 12:31.720
So as you are looking at transferring workflows, you're talking about potentially using these

12:31.720 --> 12:49.000
sorts of data for programmatic output. How do you judge

12:49.080 --> 13:04.680
any type of feedback? Is that something where you're depending on an external

13:04.680 --> 13:18.920
tool to provide you with that information, or are you able to have an analysis

13:19.160 --> 13:29.800
within Naja with the veracity that you would need to judge it,

13:29.800 --> 13:37.880
you know, good or bad, better or worse? Okay, so just to understand the question before

13:37.880 --> 13:45.720
I answer: are you talking about verification, or qualifying, you mean, in terms of the quality

13:45.800 --> 13:53.480
of transformation results, like, say, placements? Right, so,

13:53.480 --> 14:01.800
in your ongoing experiments here, you're integrating and providing this

14:01.800 --> 14:07.720
bidirectional work with other data flows, and each of these may

14:08.200 --> 14:16.520
need potentially more information coming out. Do you have a way of transferring

14:16.520 --> 14:23.560
that information about whether or not the circuit that you have designed and built

14:23.640 --> 14:37.320
within Naja provides, you know, what would qualify that circuit as better

14:37.320 --> 14:42.520
or worse? Like, what are the metrics that you're able to provide out to these

14:42.520 --> 14:51.240
external tools? Okay, so currently, I think we are not at this

14:52.200 --> 15:00.680
level yet. We are at the point of, let's say,

15:00.680 --> 15:09.560
how to generate data that is able to be transferred to those engines, and even what will be

15:09.560 --> 15:15.160
the first experiment. I mean, the first experiment will be what kind of

15:15.240 --> 15:21.560
explanation of the data those engines provide, like if we provide, I don't know, different

15:21.560 --> 15:32.440
placements, for instance, or different netlist transformations. But for now, the

15:32.440 --> 15:38.280
feedback and how we qualify this feedback, this is completely open. Maybe

15:38.440 --> 15:44.280
a little bit related: which use cases do you envision in Python, for instance?

15:44.280 --> 15:52.440
Which use cases in Python, you mean? So, there are already some public

15:52.440 --> 16:02.120
experiments on this; one is related to place and route. But for this,

16:02.120 --> 16:11.960
you need the RTL, you need the netlist, and then you also need some kind of floorplan,

16:11.960 --> 16:19.640
or a physical view of the placement, to be able to train like this. Now, will this be

16:20.440 --> 16:27.320
useful for floorplanning, placement, routing? This is completely, well,

16:27.320 --> 16:31.720
not completely open, because there are already some public experiments on this, but it is

16:31.800 --> 16:37.240
really a growing, experimental field.

16:42.760 --> 16:44.280
Okay, thank you very much, Christophe.

