
Crocotta Research & Development Ltd

Be ambitious of climbing up to the difficult, in a manner inaccessible...

ONCE UPON ANOTHER FANTASTIC DAY IN THE UK


we started to think about visualization other than polygonized surface rendering, something to bring us closer to reality.

NEXT DAY WE TURNED TO OUR DEMON:


We quickly realized that lots of people had been dealing with the problem already*, so we had to set a more future-oriented goal: what if we traveled 10-20 years ahead in time and mimicked the real environment, as much as possible, with the equipment of the future?
This time the demon did not criticize our project either: no relevant search results were found.

* happens just too often

VIRTUAL UNIVERSE
A virtual world solely made of particles
The idea was born! BUT, there are fundamental questions:
A. What is the universe?
B. What bottlenecks do we need to face?
C. Can we do any part(s) of the experiment on today's machines?

A. WHAT IS THE UNIVERSE?
We want to know the building blocks and their interaction rules. The Standard Model of physics predicts 12 fundamental particles, which interact via 4 fundamental forces. Seems modelable, so far.

HYDROGEN ATOM
is made of 2 up + 1 down quarks forming a proton, plus 1 lepton, the electron. BUT: 6×10²³ hydrogen atoms per dm³. Sounds less modelable.

1. 10¹⁴ atoms in a cell
2. 10⁹ cells per cm³
3. 3×10¹¹ stars in a galaxy
4. The observable universe: its diameter is estimated at about 93 billion light-years; it contains 10²⁴ stars (1 septillion), and the approximate number of atoms is close to 10⁸⁰.

Huh!!?

GIGANTIC NUMBERS ALL AROUND


Obviously we need to compromise here. Possible modeling options:
- Stay on the subatomic/atomic level and model nanostructures
- Organic material, provided that a living cell is the smallest element
- Galactic phenomena, with stars/planets as the smallest elements

B. WHAT BOTTLENECKS?
Simulation speed:
- Particle interaction
- Measuring, scanning (visualization, for instance)
Even if we imagined 100,000 parallel cores with fast shared memory access, petabyte storage devices, etc., we could always enlarge/expand our simulation scenario to make the hardware struggle again.

Amount of data:
Obviously we are forced to think at a smaller scale, even on a 10-20 year horizon, as the amount of data is enormous.

C. WHAT CAN WE DO ON TODAY'S MACHINES?


Well, probably a lot, because:
- If we design the system to be scalable, we can deal with the problem, at small scale, straight away.
- Due to the enormity of the task, we can't rely solely on hardware performance growth; we need to invent better algorithms anyway.

Let's start!

DEFINE AREAS OF DEVELOPMENT


We split the work into 3 major areas:
A. Scanning & visualization
B. Physics
C. Data compression, representation
In the current presentation we focus on the scanning & visualization part.

A. SCANNING & VISUALIZATION


PARTICLES IN 3D SPACE
We deal with many particles, so a raster representation may be more feasible than working with individual points (point clouds).

3D VOLUMETRIC TEXTURES
(similar to 2D textures + 1 extra spatial dimension)

Definitions:
- 2D textures have pixels
- 3D textures have voxels
- Texel means a pixel in 2D, a voxel in 3D.

Pros:
- Easy to scale up/down
- Opportunities for cheap interpolation and pattern reconstruction (see the trilinear sampling sketch below)
Cons:
- Difficult to scan and visualize
- Large data size (empty space is also stored)
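
To make the "cheap interpolation" point concrete, here is a minimal trilinear sampling sketch over a dense voxel array. This is an illustrative sketch, not our engine code; the function name and the 64³ test texture are assumptions.

```python
# Minimal trilinear sampling of a 3D texture (illustrative sketch).
import numpy as np

def sample_trilinear(volume, x, y, z):
    """Sample a dense voxel array at a fractional coordinate."""
    x0, y0, z0 = int(x), int(y), int(z)              # lower-corner voxel
    x1 = min(x0 + 1, volume.shape[0] - 1)            # clamp at the border
    y1 = min(y0 + 1, volume.shape[1] - 1)
    z1 = min(z0 + 1, volume.shape[2] - 1)
    fx, fy, fz = x - x0, y - y0, z - z0              # fractional offsets

    # Blend the 8 surrounding voxels: along x, then y, then z.
    c00 = volume[x0, y0, z0] * (1 - fx) + volume[x1, y0, z0] * fx
    c10 = volume[x0, y1, z0] * (1 - fx) + volume[x1, y1, z0] * fx
    c01 = volume[x0, y0, z1] * (1 - fx) + volume[x1, y0, z1] * fx
    c11 = volume[x0, y1, z1] * (1 - fx) + volume[x1, y1, z1] * fx
    c0 = c00 * (1 - fy) + c10 * fy
    c1 = c01 * (1 - fy) + c11 * fy
    return c0 * (1 - fz) + c1 * fz

volume = np.random.rand(64, 64, 64).astype(np.float32)  # 64^3 test texture
print(sample_trilinear(volume, 10.3, 20.7, 5.5))
```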

RAY-MARCHING
Instead of the conventional intersection testing of ray-tracing, we march forward in tiny steps along the ray.

Pro: Can access all texels/matter
Con: Damn slow (see the sketch below)
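
A minimal sketch of the naive marcher described above, assuming a dense, cubic density volume and a fixed step size; every name and test value here is illustrative.

```python
# Naive fixed-step ray-marching through a dense volume (illustrative sketch).
import numpy as np

def march_ray(volume, origin, direction, step=0.5, threshold=0.5):
    """Step along the ray in tiny fixed increments until we hit matter."""
    direction = direction / np.linalg.norm(direction)
    pos = origin.astype(np.float64)
    n = volume.shape[0]                     # assumes a cubic volume
    while np.all(pos >= 0) and np.all(pos < n - 1):
        # Nearest-voxel lookup; trilinear sampling would be smoother.
        if volume[tuple(pos.astype(int))] > threshold:
            return pos                      # first voxel above the threshold
        pos += step * direction             # the tiny step: the slow part
    return None                             # ray left the volume

volume = np.zeros((64, 64, 64), dtype=np.float32)
volume[40:50, 40:50, 40:50] = 1.0           # a solid cube as test matter
print(march_ray(volume, np.array([1.0, 1.0, 1.0]),
                np.array([1.0, 1.0, 1.0])))  # hits near (40, 40, 40)
```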

ACCELERATED RAY-MARCHING
Spatial data structures, adaptive grids:
- Binary trees
- KD-trees
- Octrees
Better, but still not efficient enough (a coarse-brick variant of the idea is sketched below).
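
The skipping idea can be sketched even without a full tree. Below is a hedged toy variant: precompute a coarse mask of which 8³ bricks contain matter, then take whole-brick steps through empty bricks. The brick size, names, and test scene are our illustrative choices; a coarse step may land a few voxels past the exact surface, and a real tracer would clamp steps to brick boundaries.

```python
# Empty-space skipping with a coarse brick mask (illustrative sketch).
import numpy as np

BRICK = 8  # assumed brick edge length, in voxels

def build_brick_mask(volume, threshold=0.5):
    """True for every BRICK^3 brick that contains at least one solid voxel."""
    n = volume.shape[0] // BRICK
    bricks = volume.reshape(n, BRICK, n, BRICK, n, BRICK)
    return (bricks > threshold).any(axis=(1, 3, 5))

def march_with_bricks(volume, mask, origin, direction, step=0.5, threshold=0.5):
    direction = direction / np.linalg.norm(direction)
    pos = origin.astype(np.float64)
    n = volume.shape[0]
    while np.all(pos >= 0) and np.all(pos < n - 1):
        if not mask[tuple((pos / BRICK).astype(int))]:
            pos += (BRICK / 2) * direction   # empty brick: take a coarse step
            continue
        if volume[tuple(pos.astype(int))] > threshold:
            return pos                       # matter found
        pos += step * direction              # fine stepping near matter only
    return None

volume = np.zeros((64, 64, 64), dtype=np.float32)
volume[40:50, 40:50, 40:50] = 1.0
mask = build_brick_mask(volume)
print(march_with_bricks(volume, mask, np.array([1.0, 1.0, 1.0]),
                        np.array([1.0, 1.0, 1.0])))
```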

ACCELERATED RAY-MARCHING
Sphere tracing & distance fields: the trick is to estimate, at any point in space, the distance to the closest surface or sharp change in the volumetric texture. This allows us to march in large steps along the ray.
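
A minimal sphere-tracing sketch. For clarity it uses an analytic signed distance function (a single sphere) in place of a sampled distance texture; in the volumetric case the distance would be read from the field instead. All names and constants are illustrative.

```python
# Sphere tracing: march in steps as large as the distance field allows.
import numpy as np

def sphere_sdf(p, center=np.array([0.0, 0.0, 5.0]), radius=1.0):
    """Signed distance from point p to a test sphere's surface."""
    return np.linalg.norm(p - center) - radius

def sphere_trace(sdf, origin, direction, max_steps=64, eps=1e-4, max_dist=100.0):
    direction = direction / np.linalg.norm(direction)
    t = 0.0
    for _ in range(max_steps):
        d = sdf(origin + t * direction)   # distance to the nearest surface
        if d < eps:
            return t                      # close enough: a hit
        t += d                            # safe to jump this far in one step
        if t > max_dist:
            break
    return None                           # missed everything

t = sphere_trace(sphere_sdf, np.array([0.0, 0.0, 0.0]),
                 np.array([0.0, 0.0, 1.0]))
print(t)                                  # ~4.0: sphere front face at z = 4
```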

INTRODUCING GRADIENT FIELDS


Possible replacement for particles? Provided the cost of field construction vs. the ray-marching speed-up is a net win. Is that possible? Yes. We've been successfully deploying gradient fields, and not only for visualization purposes, but to accelerate physics calculations too.
Further benefits:
- Scale extremely well (down/up)
- Give lots of opportunities for guessing and interpolating
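
As one hedged sketch of how such a field could be constructed, a distance field (and its gradient) can be built from a toy particle cloud on a grid with an off-the-shelf Euclidean distance transform; the resolution, particle count, and names are made-up test values, not our production pipeline.

```python
# Toy distance/gradient field built from a particle cloud (illustrative sketch).
import numpy as np
from scipy.ndimage import distance_transform_edt

n = 64
occupancy = np.ones((n, n, n), dtype=np.uint8)      # 1 = empty space
particles = np.random.randint(0, n, size=(100, 3))  # toy particle cloud
occupancy[particles[:, 0], particles[:, 1], particles[:, 2]] = 0  # 0 = matter

# Distance (in voxels) from every cell to the nearest particle: exactly
# the field a sphere tracer needs to take large, safe steps.
distance_field = distance_transform_edt(occupancy)

# Its gradient points away from the nearest matter, usable for normals
# and for physics queries.
gx, gy, gz = np.gradient(distance_field)
print(distance_field.max(), gx.shape)
```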

B. PHYSICS
Gradient fields can be put to good use in physics:
- Distance fields: dramatic speed-up in photon tracing
- Force fields, like gravity (a toy example follows)
- Energy fields, like kinetic energy
- etc.
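
For the force-field item, a worked toy example: with a potential U sampled on a grid, the force field is F = -∇U, one finite-difference pass away. The point-mass potential below is a stand-in for whatever field the simulation actually stores.

```python
# Force field from the gradient of a sampled potential (illustrative sketch).
import numpy as np

n = 64
axis = np.arange(n, dtype=np.float64)
x, y, z = np.meshgrid(axis, axis, axis, indexing="ij")
r = np.sqrt((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2) + 1e-6

U = -1.0 / r                   # gravity-like potential of a central point mass
fx, fy, fz = np.gradient(-U)   # F = -grad(U), via finite differences

# The force at any cell is now a cheap lookup instead of an O(N) particle sum.
print(fx[40, 32, 32])          # negative: pulls back toward the center
```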

C. COMPRESSION
The table below highlights that a compression method has to be deployed (size = side³, at 1 byte per texel).

Texture side (texels)  |  Size in bytes
                   32  |   32K
                   64  |  256K
                  128  |    2M
                  256  |   16M
                  512  |  128M
                 1024  |    1G
                 2048  |    8G
                 4096  |   64G
                 8192  |  512G
                16384  |    4T
                32768  |   32T
                65536  |  256T
               131072  |    2P
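
A quick sanity check of the table's arithmetic: cube the side, then format the byte count with binary prefixes (K = 2^10), matching the table's units.

```python
# Verify the texture-size table: size = side^3 bytes at 1 byte per texel.
def texture_size(side, bytes_per_texel=1):
    size = side ** 3 * bytes_per_texel
    for unit in ["", "K", "M", "G", "T", "P"]:
        if size < 1024:
            return f"{size}{unit}"
        size //= 1024
    return f"{size}E"

for side in [32, 1024, 131072]:
    print(side, texture_size(side))   # 32 -> 32K, 1024 -> 1G, 131072 -> 2P
```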

Our failed approaches:
- Lossless compression
- Conventional lossy compression, like wavelets or similar
Current approaches:
- Adaptive representation
- Focus on interesting areas
- Contour & pattern analysis
- Reconstruction

SUMMARY
We have traversed an exciting path so far, and the next months are going to be even more exciting for us. We don't want to rule out the possibility of a 2-3 orders of magnitude speed-up compared to brute-force methods, once we get all our theories into practice. And we hope our friends in the hardware department won't rest either.

To be continued
Thank you!

Crocotta Research & Development Ltd


Suite 5, 39 Irish Town, Gibraltar

We are a small team of international researchers with the aim of making technology leaps in exciting fields of exploration like virtual reality, virtual synthesis of matter, artificial intelligence, and robotics.

www.crocotta.co.uk crocotta@crocotta.co.uk +44 20 3239 7007
