<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="https://davidezordan.github.io/feed.xml" rel="self" type="application/atom+xml" /><link href="https://davidezordan.github.io/" rel="alternate" type="text/html" /><updated>2026-01-09T21:33:35+00:00</updated><id>https://davidezordan.github.io/feed.xml</id><title type="html">Davide Zordan</title><subtitle>Senior Software Engineer</subtitle><entry><title type="html">Slides and code from my session about AR/VR and point clouds at Global XR Conference 2022</title><link href="https://davidezordan.github.io/global-xr-conference-2022-getting-started-real-time-point-clouds-ar-vr/" rel="alternate" type="text/html" title="Slides and code from my session about AR/VR and point clouds at Global XR Conference 2022" /><published>2022-11-06T12:59:00+00:00</published><updated>2022-11-06T12:59:00+00:00</updated><id>https://davidezordan.github.io/global-xr-conference-2022-getting-started-real-time-point-clouds-ar-vr</id><content type="html" xml:base="https://davidezordan.github.io/global-xr-conference-2022-getting-started-real-time-point-clouds-ar-vr/"><![CDATA[<p style="text-align: left;">I have just uploaded the slides related to my session <em>Getting Started with Point Clouds and AR/VR</em> at <a href="https://globalxrconference.com/" target="_blank" rel="noopener">Global XR Conference 2022</a>.
</p>

<figure><img src="../assets/images/posts/2022/11/tec37.png" /></figure>

<p align="center">
<iframe src="//www.slideshare.net/slideshow/embed_code/key/CPBjWToGHaZbTR" width="595" height="485" frameborder="0" marginwidth="0" marginheight="0" scrolling="no" style="border:1px solid #CCC; border-width:1px; margin-bottom:5px; max-width: 100%;" allowfullscreen=""> </iframe>
</p>

<p>The full video of the presentation is available on <a href="https://www.youtube.com/watch?v=fLJ_pID_-cA" target="_blank" rel="noopener">YouTube</a>.</p>

<p>The source code is available on <a href="https://github.com/davidezordan/remote-telepresence-vr" target="_blank" rel="noopener">GitHub</a>.</p>]]></content><author><name>davidezordan</name></author><category term="blog" /><category term="Unity" /><category term="Virtual Reality" /><category term="Augmented Reality" /><summary type="html"><![CDATA[I have just uploaded the slides related to my session Getting Started with Point Clouds and AR/VR at Global XR Conference 2022.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://davidezordan.github.io/assets/images/posts/2022/11/tec37.png" /><media:content medium="image" url="https://davidezordan.github.io/assets/images/posts/2022/11/tec37.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Remote Telepresence using AR and VR</title><link href="https://davidezordan.github.io/remote-collaboration-ar-vr/" rel="alternate" type="text/html" title="Remote Telepresence using AR and VR" /><published>2022-07-28T12:59:00+00:00</published><updated>2022-07-28T12:59:00+00:00</updated><id>https://davidezordan.github.io/remote-collaboration-ar-vr</id><content type="html" xml:base="https://davidezordan.github.io/remote-collaboration-ar-vr/"><![CDATA[<p><img src="/assets/images/posts/2023/remote_telepresence.jpg" alt="Screenshot" /></p>

<p><em>Combining Augmented and Virtual Reality for Remote Collaboration</em>.</p>

<p>This project was developed as part of my Master’s Degree dissertation.
The sample enables real-time transmission of the point cloud of a 3D scene, captured with a mobile device, to a VR headset.</p>

<p>More details about the implementation and findings have also been described in this session at the <a href="https://www.youtube.com/watch?v=fLJ_pID_-cA">Global XR Conference 2022</a>. Slides are available <a href="https://davide.dev/global-xr-conference-2022-getting-started-real-time-point-clouds-ar-vr/">here</a>.</p>

<p>The point cloud acquisition code was adapted from the project <a href="https://github.com/TakashiYoshinaga/iPad-LiDAR-Depth-Sample-for-Unity">iPad LiDAR Depth Sample</a>.</p>

<p>Full source code available on <a href="https://github.com/davidezordan/remote-telepresence-vr">GitHub</a>.</p>]]></content><author><name>davidezordan</name></author><summary type="html"><![CDATA[Remote Telepresence using AR and VR]]></summary></entry><entry><title type="html">Slides from my session at Global XR Conference 2021</title><link href="https://davidezordan.github.io/global-xr-conference-getting-started-unity-ar-vr-reloaded/" rel="alternate" type="text/html" title="Slides from my session at Global XR Conference 2021" /><published>2021-12-02T12:59:00+00:00</published><updated>2021-12-02T12:59:00+00:00</updated><id>https://davidezordan.github.io/global-xr-conference-getting-started-unity-ar-vr-reloaded</id><content type="html" xml:base="https://davidezordan.github.io/global-xr-conference-getting-started-unity-ar-vr-reloaded/"><![CDATA[<p style="text-align: left;">I have just uploaded the slides related to my session <em>Getting started with Unity and AR/VR for the .NET developer... Reloaded!</em> at <a href="https://globalxrconference.com/" target="_blank" rel="noopener">Global XR Conference 2021</a>.
</p>

<p align="center">
<iframe src="//www.slideshare.net/slideshow/embed_code/key/JQkvam5dqVepO" width="595" height="485" frameborder="0" marginwidth="0" marginheight="0" scrolling="no" style="border:1px solid #CCC; border-width:1px; margin-bottom:5px; max-width: 100%;" allowfullscreen="">
</iframe>
</p>

<p>The full video of the presentation is available on <a href="https://www.youtube.com/watch?v=fNJaF6DR5Cs" target="_blank" rel="noopener">YouTube</a>.</p>]]></content><author><name>davidezordan</name></author><category term="blog" /><category term="Unity" /><category term="Virtual Reality" /><summary type="html"><![CDATA[I have just uploaded the slides related to my session Getting started with Unity and AR/VR for the .NET developer... Reloaded! at Global XR Conference 2021.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://davidezordan.github.io/assets/images/posts/2020/AR-VR-Presentation.png" /><media:content medium="image" url="https://davidezordan.github.io/assets/images/posts/2020/AR-VR-Presentation.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Analysing Unity projects using NDepend v2021.1</title><link href="https://davidezordan.github.io/ndepend-unity-support/" rel="alternate" type="text/html" title="Analysing Unity projects using NDepend v2021.1" /><published>2021-09-04T20:24:00+00:00</published><updated>2021-09-04T20:24:00+00:00</updated><id>https://davidezordan.github.io/ndepend-unity-support</id><content type="html" xml:base="https://davidezordan.github.io/ndepend-unity-support/"><![CDATA[<p style="text-align: justify;">
Patrick, the creator of the code analysis tool NDepend, recently contacted me and asked if I could provide some feedback about the latest release, v2021.1. I was pleasantly surprised to find support for both Unity and mobile applications development, as explained in detail in the <a href="https://www.ndepend.com/whatsnew" target="_blank">release notes</a>.
</p>

<p style="text-align: justify;">
The new Unity application support is particularly noteworthy: it enables a complete assessment of the C# scripts associated with a project, identifying improvements that can be applied to the code. Performance is always a top priority in Unity applications, so automated tools for highlighting potential enhancements are a valuable part of the project lifecycle.
</p>

<p style="text-align: justify;">
For this reason, NDepend's default settings have been updated to avoid false positives: for example, the <em>Fields should be declared as private</em> rule is now disabled for these projects because, for performance reasons, it is often better to use fields instead of properties (some excellent presentations about performance from the Unite Now 2020 conference are available <a href="https://www.youtube.com/watch?v=ZRDHEqy2uPI" target="_blank">here</a> and <a href="https://www.youtube.com/watch?v=EK8sX8oCQbw" target="_blank">here</a>).
Another improvement in v2021.1 is the handling of assembly references: when a third-party or framework assembly referenced by the application assemblies is not found at analysis time, it is now reconstructed from its references instead of being reported as missing.
</p>
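<p style="text-align: justify;">
To illustrate the trade-off behind that rule, here is a minimal, hypothetical Unity script (my own sketch, not taken from NDepend or the linked talks — the class and member names are made up): reading a plain field in a per-frame method avoids the compiler-generated getter call that an auto-property introduces, which the Mono scripting backend may not inline.
</p>

```csharp
using UnityEngine;

public class MoveForward : MonoBehaviour
{
    // Plain field: direct access in the hot path; [SerializeField]
    // exposes it in the Inspector without making it public.
    [SerializeField] private float speed = 2f;

    // Property: every read goes through a getter method, which the
    // Mono scripting backend may not inline.
    public float Speed
    {
        get => speed;
        set => speed = value;
    }

    private void Update()
    {
        // Per-frame code reads the field directly.
        transform.Translate(Vector3.forward * (speed * Time.deltaTime));
    }
}
```

<p style="text-align: justify;">
In practice the difference only matters in hot paths such as <em>Update</em>, which is why a blanket "fields should be private" rule produces noise in Unity projects.
</p>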

<p style="text-align: justify;">
To start, I downloaded the trial version from the official site and installed the product on my machine:
<figure><img src="../assets/images/posts/2021/09/NDepend2021-Image1.PNG" /></figure>
</p>

<p style="text-align: justify;">
Additionally, a Visual Studio extension is available and can be installed for IDE integration:
<figure><img src="../assets/images/posts/2021/09/NDepend2021-Image2.PNG" /></figure>
</p>

<p style="text-align: justify;">
I often use JetBrains Rider as an editor for Unity applications; in this case, I found it convenient to rely on the standalone <em>VisualNDepend</em> executable to perform the code analysis.
</p>

<p style="text-align: justify;">
To evaluate the new features in this release, I opened an old project I often use as a playground, which I know needs improvements in both code and functionality. Using VisualNDepend, I selected the Visual Studio solution generated by Unity and chose the corresponding <em>Assembly-CSharp</em> assembly containing the custom scripts:
<figure><img src="../assets/images/posts/2021/09/NDepend2021-Image3.PNG" /></figure>
</p>

<p style="text-align: justify;">
Once the report-generation phase completed, the related dashboard enabled visualisation of the results. By selecting the desired namespace, it was possible to drill into its specifics:
<figure><img src="../assets/images/posts/2021/09/NDepend2021-Image4.PNG" /></figure>
</p>

<p style="text-align: justify;">
From here, a dependency graph shows the relations between the different classes, which I personally find very informative for understanding the project structure. This visualisation also made it easier to analyse the technical debt associated with the different parts:
<figure><img src="../assets/images/posts/2021/09/NDepend2021-Image5.PNG" /></figure>
</p>

<p style="text-align: justify;">
I found the integrated CQLinq feature particularly useful: it enables queries over the code issues to be expressed using LINQ syntax:
<figure><img src="../assets/images/posts/2021/09/NDepend2021-Image6.PNG" /></figure>
</p>
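<p style="text-align: justify;">
For readers new to the feature, such queries are written over NDepend's code model in familiar LINQ style. A small sketch of a query of this kind (the 30-line threshold is purely illustrative) that flags long methods as rule violations:
</p>

```csharp
// CQLinq: "warnif count > 0" promotes the query into a rule
// that fails whenever the result set is non-empty.
warnif count > 0
from m in JustMyCode.Methods
where m.NbLinesOfCode > 30   // illustrative threshold
orderby m.NbLinesOfCode descending
select new { m, m.NbLinesOfCode }
```

<p style="text-align: justify;">
Queries like this run live in the query editor, so the threshold can be tuned interactively before saving it as a rule.
</p>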

<p style="text-align: justify;">
At this stage, it was possible to quickly identify potential improvements in the various C# script classes, and to refine the results easily using the integrated query tools.
</p>

<p style="text-align: justify;">
The integrated Unity support in NDepend is a valuable tool for maintaining good code quality in projects: I will definitely integrate it into my development workflow.
</p>

<p style="text-align: justify;">
Happy coding!
</p>]]></content><author><name>davidezordan</name></author><category term="blog" /><category term="NDepend" /><category term="Unity" /><summary type="html"><![CDATA[Patrick, the creator of the code analysis tool NDepend, recently contacted me and asked if I could provide some feedback about the latest release, v2021.1. I was pleasantly surprised to find support for both Unity and mobile applications development, as explained in detail in the release notes.]]></summary></entry><entry><title type="html">Black Dungeon VR</title><link href="https://davidezordan.github.io/black-dungeon-vr/" rel="alternate" type="text/html" title="Black Dungeon VR" /><published>2021-07-28T12:59:00+00:00</published><updated>2021-07-28T12:59:00+00:00</updated><id>https://davidezordan.github.io/black-dungeon-vr</id><content type="html" xml:base="https://davidezordan.github.io/black-dungeon-vr/"><![CDATA[<p><a href="https://www.youtube.com/watch?v=qMDZpuavwaM"><img src="/assets/images/posts/2023/black-dungeon-vr-screenshot.png" alt="Screenshot" /></a></p>

<p>I built this simple game, targeting Oculus Rift and Windows Mixed Reality headsets, using Unity and the SteamVR SDK.</p>

<p>✓ Developed all the game mechanics and components from the ground up, reaching a frame rate of 90 fps to avoid motion sickness.</p>

<p>✓ Implemented all the VR locomotion and interactions using the SteamVR Unity SDK.</p>

<p>✓ Enhanced the project with speech recognition functionality for providing help to the user and improving presence in VR.</p>

<p>✓ Added Natural Language Processing (NLP) using Microsoft Azure Cognitive Services and Microsoft Azure Language Understanding (LUIS).</p>

<p>✓ Used assets from the Unity Assets Store for the environment.</p>

<p>More details about the implementation and findings have also been described in this session at the <a href="https://www.youtube.com/watch?v=sFY-vrZmHow">NDC London 2021 Conference</a>. Slides are available <a href="https://davide.dev/ndc-london-2021-getting-started-unity-ar-vr/">here</a>.</p>]]></content><author><name>davidezordan</name></author><summary type="html"><![CDATA[Black Dungeon VR]]></summary></entry><entry><title type="html">Slides and code from my session at NDC London 2021</title><link href="https://davidezordan.github.io/ndc-london-2021-getting-started-unity-ar-vr/" rel="alternate" type="text/html" title="Slides and code from my session at NDC London 2021" /><published>2021-01-27T13:24:00+00:00</published><updated>2021-01-27T13:24:00+00:00</updated><id>https://davidezordan.github.io/ndc-london-2021-getting-started-unity-ar-vr</id><content type="html" xml:base="https://davidezordan.github.io/ndc-london-2021-getting-started-unity-ar-vr/"><![CDATA[<p style="text-align: left;">I have just uploaded slides and samples related to my session <em>Getting started with Unity and AR/VR for the .NET developer </em> at <a href="https://ndc-london.com" target="_blank" rel="noopener">NDC London 2021</a>.
</p>

<iframe src="//www.slideshare.net/slideshow/embed_code/key/7SgRx603u1UCc5" width="595" height="485" frameborder="0" marginwidth="0" marginheight="0" scrolling="no" style="border:1px solid #CCC; border-width:1px; margin-bottom:5px; max-width: 100%;" allowfullscreen=""></iframe>

<p>The source code is available on GitHub:</p>

<ul>
  <li><a href="https://github.com/davidezordan/MixedRealitySamples/tree/master/SteamVR%20Demo" target="_blank" rel="noopener">Interactions and locomotion using SteamVR</a></li>
  <li><a href="https://github.com/davidezordan/CognitiveServicesSamples" target="_blank" rel="noopener">HoloLens object recogniser using Azure Cognitive Services</a></li>
  <li><a href="https://github.com/microsoft/MixedRealityToolkit-Unity" target="_blank" rel="noopener">Mixed Reality Toolkit</a></li>
  <li><a href="https://github.com/provencher/MRTK-Quest-Sample" target="_blank" rel="noopener">MRTK hand tracking sample using Oculus Quest by Eric Provencher</a></li>
</ul>]]></content><author><name>davidezordan</name></author><category term="blog" /><category term="Unity" /><category term="Virtual Reality" /><summary type="html"><![CDATA[I have just uploaded slides and samples related to my session Getting started with Unity and AR/VR for the .NET developer at NDC London 2021.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://davidezordan.github.io/assets/images/posts/2020/AR-VR-Presentation.png" /><media:content medium="image" url="https://davidezordan.github.io/assets/images/posts/2020/AR-VR-Presentation.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Slides and code from my session about AR/VR at DDD North 2020 conference</title><link href="https://davidezordan.github.io/ddd-north-2020-getting-started-unity-ar-vr/" rel="alternate" type="text/html" title="Slides and code from my session about AR/VR at DDD North 2020 conference" /><published>2020-02-27T22:57:00+00:00</published><updated>2020-02-27T22:57:00+00:00</updated><id>https://davidezordan.github.io/ddd-north-2020-getting-started-unity-ar-vr</id><content type="html" xml:base="https://davidezordan.github.io/ddd-north-2020-getting-started-unity-ar-vr/"><![CDATA[<p style="text-align: left;">I've just uploaded the slides and samples related to my session <em>Getting started with Unity and AR/VR for the .NET developer </em> at <a href="https://www.dddnorth.co.uk/" target="_blank" rel="noopener">DDD North 2020 conference</a>.
</p>

<iframe src="//www.slideshare.net/slideshow/embed_code/key/qSGfIFHEktnriE" width="595" height="485" frameborder="0" marginwidth="0" marginheight="0" scrolling="no" style="border:1px solid #CCC; border-width:1px; margin-bottom:5px; max-width: 100%;" allowfullscreen=""> </iframe>

<p>The source code is available on GitHub:</p>
<ul>
  <li><a href="https://github.com/davidezordan/MixedRealitySamples/tree/master/SteamVR%20Demo" target="_blank" rel="noopener">Interactions and locomotion using SteamVR</a></li>
  <li><a href="https://github.com/davidezordan/CognitiveServicesSamples" target="_blank" rel="noopener">HoloLens object recogniser using Azure Cognitive Services</a></li>
  <li><a href="https://github.com/provencher/MRTK-Quest" target="_blank" rel="noopener">MRTK-Quest - Mixed Reality Toolkit Extensions for Oculus Quest</a></li>
</ul>]]></content><author><name>davidezordan</name></author><category term="blog" /><summary type="html"><![CDATA[I've just uploaded the slides and samples related to my session Getting started with Unity and AR/VR for the .NET developer at DDD North 2020 conference.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://davidezordan.github.io/assets/images/posts/2020/01/DDD_North_2020.png" /><media:content medium="image" url="https://davidezordan.github.io/assets/images/posts/2020/01/DDD_North_2020.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry><entry><title type="html">Rube Goldberg VR</title><link href="https://davidezordan.github.io/rube-golberg-vr/" rel="alternate" type="text/html" title="Rube Goldberg VR" /><published>2020-01-20T12:59:00+00:00</published><updated>2020-01-20T12:59:00+00:00</updated><id>https://davidezordan.github.io/rube-golberg-vr</id><content type="html" xml:base="https://davidezordan.github.io/rube-golberg-vr/"><![CDATA[<p><img src="/assets/images/posts/2020/rube-goldber-vr.png" alt="Screenshot" /></p>

<p>I built this simple game, targeting Oculus Rift and Windows Mixed Reality headsets, using Unity and the SteamVR SDK.</p>

<p>The goal is to collect all the stars available in the environment with a single launch of the ball from the platform and then reach the goal target.</p>

<p>The user can navigate the environment with the Oculus Touch controllers, teleporting via the left-hand thumbstick. The object selection menu is activated with the right thumbstick, and objects can be grabbed and interacted with using the trigger buttons.</p>

<p>Full source code available on <a href="https://github.com/davidezordan/RubeGoldberg">GitHub</a>.</p>]]></content><author><name>davidezordan</name></author><summary type="html"><![CDATA[Rube Goldberg VR]]></summary></entry><entry><title type="html">Welcome to NYC</title><link href="https://davidezordan.github.io/welcome-to-nyc/" rel="alternate" type="text/html" title="Welcome to NYC" /><published>2019-11-19T12:59:00+00:00</published><updated>2019-11-19T12:59:00+00:00</updated><id>https://davidezordan.github.io/welcome-to-nyc</id><content type="html" xml:base="https://davidezordan.github.io/welcome-to-nyc/"><![CDATA[<p><img src="/assets/images/posts/2019/Welcome-to-NYC.png" alt="Screenshot" /></p>

<p>A simple VR scene targeting mobile headsets (Oculus Go), illustrating interactions with virtual characters and the plausibility illusion.</p>]]></content><author><name>davidezordan</name></author><summary type="html"><![CDATA[Welcome to NYC]]></summary></entry><entry><title type="html">Slides and code from my Virtual Reality session at DDD Reading 2019</title><link href="https://davidezordan.github.io/slides-and-code-from-my-virtual-reality-session-at-ddd-reading-2019/" rel="alternate" type="text/html" title="Slides and code from my Virtual Reality session at DDD Reading 2019" /><published>2019-10-09T20:55:00+00:00</published><updated>2019-10-09T20:55:00+00:00</updated><id>https://davidezordan.github.io/slides-and-code-from-my-virtual-reality-session-at-ddd-reading-2019</id><content type="html" xml:base="https://davidezordan.github.io/slides-and-code-from-my-virtual-reality-session-at-ddd-reading-2019/"><![CDATA[<p style="text-align: left;">I've just uploaded the slides and samples related to my session <em>Getting Started with Unity and AR/VR for the .NET Developer</em> at <a href="https://www.developerdeveloperdeveloper.com/" target="_blank" rel="noopener">DDD Reading 2019</a>.</p>
<iframe src="//www.slideshare.net/slideshow/embed_code/key/xGYh9G5NTT2DUb" width="595" height="485" frameborder="0" marginwidth="0" marginheight="0" scrolling="no" style="border:1px solid #CCC; border-width:1px; margin-bottom:5px; max-width: 100%;" allowfullscreen=""> </iframe>

<p>The source code is available on GitHub:</p>
<ul>
  <li><a href="https://github.com/davidezordan/MixedRealitySamples/tree/master/SteamVR%20Demo" target="_blank" rel="noopener">Interactions and locomotion using SteamVR</a></li>
  <li><a href="https://github.com/davidezordan/CognitiveServicesSamples" target="_blank" rel="noopener">Cognitive Services samples</a></li>
</ul>]]></content><author><name>davidezordan</name></author><category term="blog" /><category term="Unity" /><category term="Virtual Reality" /><category term="HoloLens" /><summary type="html"><![CDATA[I've just uploaded the slides and samples related to my session Getting Started with Unity and AR/VR for the .NET Developer&nbsp;at DDD Reading 2019.]]></summary><media:thumbnail xmlns:media="http://search.yahoo.com/mrss/" url="https://davidezordan.github.io/assets/images/posts/2019/10/DDD_DeveloperDay_2019.png" /><media:content medium="image" url="https://davidezordan.github.io/assets/images/posts/2019/10/DDD_DeveloperDay_2019.png" xmlns:media="http://search.yahoo.com/mrss/" /></entry></feed>