### XNA and Farseer Physics Engine 3.3 Presentation at Nokia

The slides and demos from the presentation at Nokia can be found here.

Thanks to everyone that showed up!



I often find myself needing a compression library in a software development project, and after a few tests with disappointing results, I usually fall back to spawning 7Zip as an external process. This time, however, the application specification prohibits any data being written to disk, which 7Zip needs to do in order to work correctly. I needed to do everything in memory, and so I had to test some of the C# libraries available for compression, as well as the different compression algorithms.

Here is a full list of the tested libraries:

- .NET DeflateStream
- ICSharpCode SharpZip
- Zlib.NET
- DotNetZip
- Managed LZMA
- LZMA SDK
- QuickLZ

#### Compression 101

Before I get to the results of the benchmark, I think it is important to understand the different archive formats and compression algorithms in order to compare the results with other benchmarks and/or tests. Let's get to it!

A compression algorithm can be compared to a small machine that takes in some data, does some math on it, and transforms it into some new data -…
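To make the in-memory requirement concrete, here is a minimal sketch of round-tripping a byte array through .NET's built-in `DeflateStream` without ever touching disk. This is only an illustration of the general pattern, not the benchmark code from the post:

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class InMemoryDeflateDemo
{
    // Compress a byte array entirely in memory with DeflateStream.
    static byte[] Compress(byte[] input)
    {
        using (var output = new MemoryStream())
        {
            using (var deflate = new DeflateStream(output, CompressionMode.Compress))
            {
                deflate.Write(input, 0, input.Length);
            }
            return output.ToArray();
        }
    }

    // Decompress the deflate-encoded bytes back into the original data.
    static byte[] Decompress(byte[] input)
    {
        using (var source = new MemoryStream(input))
        using (var deflate = new DeflateStream(source, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            deflate.CopyTo(output);
            return output.ToArray();
        }
    }

    static void Main()
    {
        // Highly repetitive input compresses well with any deflate-based codec.
        byte[] data = Encoding.UTF8.GetBytes(new string('a', 1000));
        byte[] packed = Compress(data);
        byte[] unpacked = Decompress(packed);
        Console.WriteLine("{0} bytes -> {1} bytes", data.Length, packed.Length);
    }
}
```

The other libraries in the list above follow the same stream-in, stream-out shape, which is what makes a fully in-memory pipeline possible.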

One of the important algorithms in vector graphics and robotics is the Ramer-Douglas-Peucker algorithm. It simplifies a polygon by reducing the number of points within some tolerance. It was invented independently by Urs Ramer in 1972 and by David Douglas & Thomas Peucker in 1973. In 1992 it was improved by John Hershberger & Jack Snoeyink, at the cost of the algorithm only working with simple 2D planar polylines, and not in higher dimensions.

Here is how it roughly works:

0. Start with a polygon.
1. Make a polyline between the two endpoints. This is the initial approximation to the simplified polygon.
2. Calculate the distance between the edge and the remaining vertices. If the vertex tested is further away than the user-defined tolerance, then the vertex is added to the simplification.
3. Repeat step 2 until you have a complete approximation of the polygon, like shown in step 4 in the picture.

The idea with the algorithm is that we can give it an input like the one to the left and get …
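The steps above can be sketched as a recursive routine. This is a minimal illustration of the classic algorithm (not Hershberger & Snoeyink's improved variant), with a hypothetical `Point` struct standing in for whatever vector type your framework provides:

```csharp
using System;
using System.Collections.Generic;

struct Point
{
    public double X, Y;
    public Point(double x, double y) { X = x; Y = y; }
}

static class RamerDouglasPeucker
{
    // Perpendicular distance from point p to the line through a and b.
    static double Distance(Point p, Point a, Point b)
    {
        double dx = b.X - a.X, dy = b.Y - a.Y;
        double len = Math.Sqrt(dx * dx + dy * dy);
        if (len == 0)
            return Math.Sqrt((p.X - a.X) * (p.X - a.X) + (p.Y - a.Y) * (p.Y - a.Y));
        return Math.Abs(dy * p.X - dx * p.Y + b.X * a.Y - b.Y * a.X) / len;
    }

    public static List<Point> Simplify(List<Point> points, double tolerance)
    {
        if (points.Count < 3)
            return new List<Point>(points);

        // Step 1-2: find the vertex furthest from the edge between the endpoints.
        int index = 0;
        double maxDist = 0;
        for (int i = 1; i < points.Count - 1; i++)
        {
            double d = Distance(points[i], points[0], points[points.Count - 1]);
            if (d > maxDist) { maxDist = d; index = i; }
        }

        // Step 3: if that vertex exceeds the tolerance, keep it and recurse on both halves.
        if (maxDist > tolerance)
        {
            var left = Simplify(points.GetRange(0, index + 1), tolerance);
            var right = Simplify(points.GetRange(index, points.Count - index), tolerance);
            left.RemoveAt(left.Count - 1); // drop the shared midpoint before joining
            left.AddRange(right);
            return left;
        }

        // Otherwise the straight edge is a good enough approximation of this stretch.
        return new List<Point> { points[0], points[points.Count - 1] };
    }
}
```

Calling `RamerDouglasPeucker.Simplify(polyline, tolerance)` returns a polyline with fewer vertices, where no discarded vertex was further than `tolerance` from the simplified shape.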

In an ongoing project of mine, I needed access to a good source of information that is free, powerful, and provides some sort of API. My first thought was Wolfram|Alpha, a powerful search engine that has thousands of databases filled with useful information and a really great interpreter that tries to guess what you mean by "the 22nd president of the USA". If you have not yet tried Wolfram|Alpha, I highly encourage you to take a look at it.

## Comments

You had a tough crowd, but it was a great presentation!

## Post a Comment