NYLXS Mailing Lists and Archives
NYLXS members have a lot to say and share, and we don't keep many secrets. Join the Hangout mailing list and say your piece.

DATE 2015-06-01

LEARN



MESSAGE
DATE 2015-06-02
FROM Ruben Safir
SUBJECT [LIU Comp Sci] 3d scanning and virtualization
From owner-learn-outgoing-at-mrbrklyn.com Tue Jun 2 14:01:03 2015
Return-Path:
X-Original-To: archive-at-mrbrklyn.com
Delivered-To: archive-at-mrbrklyn.com
Received: by mrbrklyn.com (Postfix)
id A7E8E1612F1; Tue, 2 Jun 2015 14:01:03 -0400 (EDT)
Delivered-To: learn-outgoing-at-mrbrklyn.com
Received: by mrbrklyn.com (Postfix, from userid 28)
id 8E8DB1612F0; Tue, 2 Jun 2015 14:01:03 -0400 (EDT)
Delivered-To: learn-at-nylxs.com
Received: from mailbackend.panix.com (mailbackend.panix.com [166.84.1.89])
by mrbrklyn.com (Postfix) with ESMTP id 8FC081612F0
for ; Tue, 2 Jun 2015 14:00:37 -0400 (EDT)
Received: from [10.0.0.19] (www.mrbrklyn.com [96.57.23.82])
by mailbackend.panix.com (Postfix) with ESMTPSA id 81C7113809;
Tue, 2 Jun 2015 14:00:36 -0400 (EDT)
Message-ID: <556DEF44.8000908-at-panix.com>
Date: Tue, 02 Jun 2015 14:00:36 -0400
From: Ruben Safir
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:31.0) Gecko/20100101 Thunderbird/31.7.0
MIME-Version: 1.0
To: hangout-at-mrbrklyn.com
Subject: [LIU Comp Sci] 3d scanning and virtualization
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 8bit
Sender: owner-learn-at-mrbrklyn.com
Precedence: bulk
Reply-To: learn-at-mrbrklyn.com

Last night I went to a NewSpace meeting at Columbia University. They
focus on private-sector space flight and exploration, with a lot of
discussion about entrepreneurship in this field. Sean Casey, from
the Silicon Valley Space Center, was the speaker. He gave an excellent
presentation and really does fuel the imagination. I don't know how I
would approach the area, but they are interested in creating what is,
for all practical purposes, a user group for space technology. Khaki
Rodway and Michael Mooring are leading this. They asked if I could come
to the Friday evening meeting, which I obviously can't do.

One of the points of discussion was the use of 3D printing on the
International Space Station, which is a great area of study. The other
area that I found "reachable," though, was the talk of 3D VR. There has
been a lot of work in 3D VR over the last few decades. At one point we
were talking about a web-standard VR technology that never drew wide
acceptance.


http://www.spacevr.co/?email=

This is an example of a proposed 3D VR project from the ISS; it involves
a Kickstarter campaign.

This is from Forbes, written by Mr. Wolfe, one of the individuals I met
last night:

~~~~~~~~~~~~~~~~


Rocket Scientist Launches Into Virtual Worlds



Josh Wolfe

Contributor


/Eric Romo is the CEO of Altspace VR, a shared browsing environment for
virtual reality. (Full disclosure: My venture firm Lux Capital is an
equity investor in Altspace.) Prior to Altspace, Eric was the founder of
GreenVolts and the 13th employee at SpaceX. Eric graduated from
Cooper Union with a B.E. in Mechanical Engineering, and received an MS
in Mechanical Engineering and an MBA from Stanford University./

*Tell us a bit about your background.*

I grew up in New Jersey, did my undergraduate studies in Mechanical
Engineering at Cooper Union, and came to Stanford for my Masters, where
I focused on Thermal Sciences. After that, I joined SpaceX when that
company was very small. I was the 13th employee there and helped design
and test rocket engines, which was a dream come true for me. I did that
for a couple of years and went back to Stanford for my MBA. While I was
there I got really excited about renewable energy and started a solar
technology company that, over six and a half years, ended up raising
about $120 million in venture capital and growing to a little over 100
employees. We had manufacturing operations in China and sales and
marketing partners all over the world.

I did that through the end of 2012, and got intellectually interested in
cognitive neuroscience, neuroplasticity and how our brains process
information. That intellectual meandering led me towards virtual reality
(VR) related technologies. At its core, virtual reality is technology
that tricks your brain into thinking that you’re someplace else or doing
other things. I started reading a lot about virtual reality, and then in
2013 I decided to start a company around it, Altspace VR. This was
before Oculus had raised even their Series A, let alone been acquired. I
made a bet that this was a direction that the world was going to head
in. Luckily we seem to be right about that bet so far. It’s been almost
two years now, and I’m very excited about what we’re working on.

*How did cognitive neuroscience get you interested in virtual reality?*

The first book that really caught my attention was “The Brain That
Changes Itself” by a Columbia professor and doctor, Norman Doidge, and
it’s really all about this new way of thinking about neuroplasticity
that has changed neuroscience over the last decade. It’s a very
different framework for thinking about the brain. That got me really
thinking. You can teach different parts of your brain to do different
things than they may have been wired to do in the past. Studies have
shown that people can lose one of their senses and the portion of their
brain that was previously used by that sense is then utilized for other
purposes. Other senses get heightened because of more brain capacity
being available. This concept of neuroplasticity led me to think, “Okay,
can you teach your brain to interface with other types of devices or
sensors or whatever it might be?” Virtual reality is exactly that:
giving your brain a different set of inputs than it knows how to use
naturally, but it will adapt to and use better over time.

*What was the vision when you first started your company and what is the
vision now?*

When we started, the vision was to be somehow involved in the VR market.
We looked at everything in the space from hardware to software and took
a broad approach to ask, “What’s the right place to be in this field?”
Our thesis was that the rise of VR would be inevitable, and that
technology will be everywhere. Part of our thesis was that a lot of the
hardware was going to be commoditized. If, then, you’re going to be in
software, does it make sense to start broadly or to be very specific
about which use case and application you’re going to support? If you’re
going to be in software – enterprise, or consumer – there is so much you
can do.

We ended up talking to a lot of different potential users for a variety
of different use cases and the theme that came up over and over and over
again was that the people were really emotionally impacted and excited
about the idea of ‘being together’ in a virtual space. VR gives us the
ability to feel like we’re in the same place together. The vision of
Altspace is to demonstrate how we can use VR to create social virtual
spaces, to make conversations more fulfilling than phone calls or Skype
sessions, and to enable VR to create a more emotional connection.

*What is AltSpaceVR right now?*

We’re a company that facilitates shared experiences in virtual reality.
We think that VR is going to be the most natural communications platform
that exists online. By ‘natural’, I mean there’s all sorts of things
about body language and eye contact and non-verbal communication that
are possible in VR but really hard in other mediums. Take a phone
call, for example, or a video call. It’s essentially impossible for two
people to look each other in the eye. In VR you can and you do feel like
you’re making eye contact with somebody, even if it is a digital
representation, or an avatar. There are natural ways that we interact
when we’re in a room together that are totally lost in video chat, and
even more so in audio chat conversations. So, a big part of what we do
is enabling natural communication.

Once we’ve enabled this natural communication, we need something to do.
Talking is great, but having a shared experience is a bit better. We’re
facilitating shared experiences by bringing the Internet into that
environment and saying “Okay, let’s build an experience around web
content that we can all share at the same time.”

*Can you give some examples of what that means?*

You may have had the experience where you and friends are hanging out
together and you want to watch YouTube videos and say “Oh, did you see
that video? Did you see this video?” This works well if you’re all
together in the same physical place because you can take turns showing
each other content on the same screen. But when you’re in different
places connected by screens, there’s something that’s lost about that.
You can’t hear each other laugh at the same time; you’re not sure who’s
going to go next. We’re building the ability for us to feel like we’re
in the same place together. That’s just one example of content. We can
build a synchronized experience around anything that’s on the web. It
might be you and your friends watching funny YouTube videos, or it might
be you and a significant other watching a Netflix movie when they’re far
away. However, 2D is not our ultimate end goal. While it’s fulfilling to
watch a Netflix movie or a YouTube video on a gigantic movie theater
sized display in virtual space, it would be even more fulfilling if that
was happening in 3D around us, with things popping out of the screen. So
we’ve also spent time working on new methodologies for bringing web
content out of the plane of 2D and into a 3D space, with the goal of
extending that shared experience into 3D as well.

*Can you briefly walk us through the Altspace experience?*

We did a closed beta test a few weeks ago. We had about 100 people using
our product, an application that runs on the computer that works with a
number of devices but currently works best with the Oculus Rift. You
launch the application, log in on your headset and you’re now in a new
virtual space. The first space is what we call the Welcome Space, similar
to a conference room. From there, we have a hierarchy of spaces you
might want to go to. You press another button and instantly you might be
in the video space. Now you’re in a completely different environment,
you see lots of people running around and you can walk over, or teleport
over and say “I want to be over near those other people and I want to
talk to them.” So, you click a button and now you’re over near them, and
you can start having a conversation. We can take the information from
the headset and reflect it extremely accurately on your avatar.

One of the cool examples from this closed demo: we had about 100 people
from 15 different countries, and for whatever reason, we had a large
French contingent in the room. Inevitably, a French person would come
into the room, hear other people speaking French, and all of a sudden
there was a congregation of French people standing together, speaking
French, like they would at a party. When they would greet each other,
because their headsets tracked their orientation, they started doing the
classic French two-cheek kiss. One of them came over and did this to me,
and it felt like that person had just tried to kiss my cheek, because it
feels so realistic.


*What is the size of your team and what are your hiring needs?*

We currently have about 15 people on our team and are based in Redwood
City. We've got a lot of openings on the engineering side. I
think what’s cool about what we’re doing is it’s difficult. There are a
lot of really tough problems to solve. We need smart engineers to come
help solve them. What’s exciting about being here is our technology is a
unique mix of existing game/3D technologies and web technologies, so we
can take people with those backgrounds and even though you might be a
front-end JavaScript developer who works on traditional websites, you
can take those exact skills and then use them to build virtual reality
applications.

*How do you define success for Altspace?*

We’ll find ourselves successful when we have provided this amazing
social VR experience for people and we’ve found a lot of different
things for people to experience in a shared way. We’ll be successful if
we look out in a couple of years and there are dozens of different
activities that people are engaging in together on the Altspace platform.


~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Maybe we can cook up some ideas for this. Also, even outside of space,
3D VR is becoming a hot topic for many useful applications. The British
are looking at building a rail museum using VR:

http://www.engadget.com/2015/06/01/london-mail-mail-vr/


How laser scans and VR are preserving London's hidden 'Mail Rail'

by *Nick Summers* | -at-nisummers | 1 day ago



Deep in the heart of London, buried beneath 70 feet of soil and
concrete, lies a hidden underground railway. For almost 80 years, the
UK's "Mail Rail" transported letters and parcels between the capital's
main post offices and a few overground train stations, where they could
then be delivered across the country. It was a unique way to avoid
street congestion, but by 2003 the line had become uneconomical to run.
The decision was made to shut it down, and it has lain dormant ever
since, invisible to the public.

Now, the British Postal Museum & Archive (BPMA) wants to open the Mail
Rail to the masses. The organization's plan is to open a new museum near
Mount Pleasant station and convert a section of the line into a ride. It
will, inevitably, mean making some changes to the railway as it stands
today. But before any renovations are made, the BPMA wants to preserve
the space with a digital archive. Rather than simply taking some photos,
though, or moving the best artifacts into glass cabinets, the
organization opted for a technology called LIDAR. Similar to radar or
sonar, this process involves firing a laser in every direction and
measuring the time it takes to reflect off other objects. All of these
recordings then create a "point cloud," which specialist companies can
use to create 3D models. It's also the same technology that self-driving
cars use to detect and analyze their surroundings.
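The time-of-flight arithmetic described above is simple to sketch. Here is a minimal illustration in Python; the function name and angle convention are my own, not anything ScanLab has published:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_to_point(round_trip_s, azimuth_rad, elevation_rad):
    """Turn one LIDAR return (round-trip time plus the beam's two
    angles) into an (x, y, z) point relative to the scanner."""
    r = C * round_trip_s / 2.0  # halve it: the pulse travels out and back
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)

# A surface 10 m straight ahead returns after roughly 66.7 nanoseconds.
t = 2 * 10.0 / C
point = tof_to_point(t, 0.0, 0.0)  # → (10.0, 0.0, 0.0)
```

Sweeping the two angles over a full sphere and collecting one such point per laser pulse is what builds up the "point cloud" the article mentions.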


Hiring the experts

To record such an unusual site, the BPMA enlisted ScanLab Projects.
Based in London's Bethnal Green, the company has used LIDAR to document
a raft of spectacular places, including the D-Day landing beaches in
Normandy, France; a shipping gallery in London's Science Museum; and
parts of the Arctic Ocean near Svalbard, Norway.
After capturing each location with the laser scanner, ScanLab goes over
them again with a DSLR camera. Back in the office, the team then
flattens the 3D model into 2D panoramas and lines them up with the DSLR
photos. The images from the laser scanner contain depth information,
meaning the colors captured by the DSLR can later be applied to the 3D
model.
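That align-and-color step can be sketched in a few lines, assuming (purely for illustration) that the photo is an equirectangular panorama taken from the scanner's position; the function is hypothetical, not ScanLab's actual pipeline:

```python
import math

def colorize(points, panorama):
    """Give each scanned point the color of the panorama pixel that the
    same viewing ray passes through. `panorama` is a row-major grid of
    colors, assumed equirectangular and centered on the scanner."""
    height = len(panorama)
    width = len(panorama[0])
    colored = []
    for x, y, z in points:
        r = math.sqrt(x * x + y * y + z * z)
        az = math.atan2(y, x)   # -pi..pi, around the scanner
        el = math.asin(z / r)   # -pi/2..pi/2, up/down
        u = int((az + math.pi) / (2 * math.pi) * (width - 1))
        v = int((math.pi / 2 - el) / math.pi * (height - 1))
        colored.append((x, y, z, panorama[v][u]))
    return colored
```

Because every point carries its own depth, the same lookup works in reverse: you can re-render the colored cloud from any viewpoint, which is exactly the "choose the angle later" workflow Trossell describes.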

"Conceptually, this removes the need to take a photograph and choose the
angle when you're at the location," ScanLab co-founder William Trossell
says. "You can come back into our office and spend months, or years
finding exactly the right perspective."

If any space deserves such meticulous treatment, it's the Mail Rail.
While it was operational, the carriages would carry up to 4 million
letters along 23 miles of track every day. It was the first driverless,
electrified railway and the only purpose-built underground mail transit
system in the world. The line was originally called the "Post Office
Underground Railway" and it launched in 1927, but the initial tunneling
work was actually completed a decade earlier. Its structural integrity
proved useful in World War I when it was used to protect art pieces from
The National Portrait Gallery, the Tate and the British Museum during
German bombing raids. In World War II, the network also doubled as
dormitories for post office staff.

Most importantly, the Mail Rail has been left untouched since its
closure. A few engineers still work on the line to check for water
damage and other structural problems, but otherwise nothing has been
moved. Royal Mail never planned to close the Mail Rail down completely,
so on the last "official" day in 2003, staff simply downed their tools
and left. They unknowingly created a near-perfect time capsule, a
snapshot in history.

ScanLab spent five days mapping the railway with two separate scanning
teams. Even now, the BPMA isn't sure how it'll use the data inside the
new museum. VR is one option, but the team is also considering mobile
apps. Visitors could hold their phones up at the walls, for instance,
and see the original space like a rift in the fabric of time. Parts
could also be used as projections during the ride, or as an alternative
experience for visitors with disabilities. "For people with
claustrophobia, or people that aren't comfortable with enclosed spaces,
it's not going to be a pleasant experience on the ride," a BPMA
spokesperson said. "However, we want them to be able to experience it,
so applications like this are some of the options we're now exploring to
try and bring that experience to them."

LIDAR data can be used for many different purposes. A surveyor might be
interested in the raw geographical information -- just a spreadsheet
with the numbers the LIDAR spat out. An architect, however, could
request a top-down plan of a building. "We can take the roof off the
structure and then pull the first floor away from the second floor --
almost architecturally dissect the building," Trossell adds. "Then it
becomes a good tool for investigative processes, where you're trying to
forensically re-examine a crime scene, or work out where the light
sockets are because you need to know where to put the new ones." Other
LIDAR and 3D visualization companies are doing similar work: Digital
Surveys, for instance, mapped a vessel called the Northern Wave to help
engineers design new upgrades, and Historic Scotland and the Glasgow
School of Art are scanning 10 historic landmarks, including five World
Heritage Sites in Scotland, for preservation purposes.
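The architect's "pull the first floor away from the second floor" view is essentially a height filter over the point cloud. A toy sketch of the idea, with a made-up function name and grid size:

```python
def floor_plan(points, z_min, z_max, cell=0.1):
    """Project every point whose height falls within one storey onto a
    coarse 2D grid, yielding a top-down occupancy map -- a crude plan
    of that floor. `cell` is the grid resolution in metres."""
    occupied = set()
    for x, y, z in points:
        if z_min <= z < z_max:  # keep only this storey's points
            occupied.add((int(x // cell), int(y // cell)))
    return occupied

# Two wall points on the ground floor and one on the floor above:
cloud = [(0.05, 0.02, 1.0), (0.31, 0.02, 1.2), (0.05, 0.02, 4.0)]
plan = floor_plan(cloud, 0.0, 3.0)  # → {(0, 0), (3, 0)}
```

A surveyor would keep the raw (x, y, z) rows instead; the same data supports both outputs, which is the point the article is making.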


Taking a trip in VR

LIDAR visualizations are rarely used in VR experiences, though. That's
hardly a surprise, given that VR is an emerging technology and major
players such as Oculus VR, Sony and Valve have yet to release consumer
hardware. But ScanLab has been pressing forward and exploring how its
model could be adapted for virtual reality. In its spacious design
studio in London's Bethnal Green, the company has rigged up an Oculus
Rift DK2 headset with plastic prongs and white balls attached on top.
Six cameras on the ceiling track their whereabouts and replicate the
user's movement inside the Mail Rail visualization.

The experience differs from typical VR demos because it shows an exact
reproduction of a real-world location, rather than a level from a video
game. The idea is that users will be drawn to the Mail Rail's nooks and
crannies and everyday objects knowing that, over a decade ago, real
people were interacting with them. Walking through the model from the
same perspective as an employee should, in theory, help people to
visualize what it must've been like down there, especially during the
two World Wars.

Gallery: Mail Rail in VR (15 photos)

For now, ScanLab is only loading a portion of its 3D model inside the
Oculus Rift. Booting up the entire visualization, at least with their
current hardware, would involve too much processing. Not that it really
matters -- ScanLab's motion-tracking setup is in the middle of its
office, so testers can only walk three or four steps before bumping into
tables and chairs anyway. At such close quarters, the quality of the
model isn't perfect either. Everything looks just a tad grainy, like an
analog TV that hasn't been tuned correctly. In addition, ScanLab can
only load a single LIDAR scan at once. It means that if you look in
places that, at the time of capture, were blocked by other objects in
front of the scanner, you'll sometimes see black "data shadows."
However, this experience is only an experiment -- a version for the
museum would no doubt incorporate a more complete model.
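Those "data shadows" arise because a single scan can only capture what its own laser could reach; merging several registered scans lets each fill the others' occlusions. A minimal sketch, assuming the scans are already aligned in one shared coordinate frame (the registration step itself is omitted, and the voxel size is my own choice):

```python
def merge_scans(scans, voxel=0.05):
    """Combine several registered scans into one cloud, keeping one
    point per voxel so overlapping regions aren't double-counted.
    A point visible in any scan survives, filling occlusion shadows
    left by the others."""
    kept = {}
    for scan in scans:
        for x, y, z in scan:
            key = (round(x / voxel), round(y / voxel), round(z / voxel))
            kept.setdefault(key, (x, y, z))  # first scan to see a voxel wins
    return list(kept.values())

scan_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
scan_b = [(1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]  # overlaps scan_a at x=1
merged = merge_scans([scan_a, scan_b])       # 3 unique points, not 4
```

The same voxel trick doubles as downsampling: a coarser `voxel` cuts the point count, which is one plausible way to get a large model light enough for a headset to render.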

VR is an immersive way to experience any 3D space. But ultimately, the
work BPMA and ScanLab have done goes beyond a cumbersome set of goggles.
They have digitally archived a place that few people have ever seen
before, and soon it'll be available to anyone that's able to travel to
London. In humanity's quest to preserve historic spaces, LIDAR is
proving itself to be a valuable tool. The challenge now is to apply that
data in a way that benefits the upcoming museum and the stories its
curators want to tell.

[Image Credits: British Postal Museum & Archive/Miles Willis (Lead
photo, Mount Pleasant Mail Rail station photos); ScanLab Projects (Mail
Rail graphic and gallery)]

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The company doing this scanning is ScanLab Projects, based in London:

http://scanlabprojects.co.uk/3dscanning

I don't know what free software tools are available, but this is an area
that needs full investigation.

Ruben


  1. 2015-06-01 Ruben Safir <mrbrklyn-at-panix.com> Re: [LIU Comp Sci] Summer NYLXS Study Schedule
  2. 2015-06-01 Ruben Safir <mrbrklyn-at-panix.com> Subject: [LIU Comp Sci] Artifical Intelligence Workshop cancelled tonight for Space Program
  3. 2015-06-01 Ruben Safir <mrbrklyn-at-panix.com> Subject: [LIU Comp Sci] Logicworks
  4. 2015-06-02 Ruben Safir <mrbrklyn-at-panix.com> Subject: [LIU Comp Sci] 3d scanning and virtualization
  5. 2015-06-04 Ruben Safir <mrbrklyn-at-panix.com> Subject: [LIU Comp Sci] Linux Laptops cheap
  6. 2015-06-06 Ruben Safir <mrbrklyn-at-panix.com> Subject: [LIU Comp Sci] Brooklyn Press
  7. 2015-06-07 Ruben Safir <mrbrklyn-at-panix.com> Subject: [LIU Comp Sci] This weeks schedule
  8. 2015-06-10 Ruben Safir <mrbrklyn-at-panix.com> Subject: [LIU Comp Sci] Dynamic Network Stacks
  9. 2015-06-17 Ruben Safir <mrbrklyn-at-panix.com> Subject: [LIU Comp Sci] Programming and Design contest
  10. 2015-06-18 Ruben Safir <mrbrklyn-at-panix.com> Subject: [LIU Comp Sci] join me for school with a scholarship from the Linux foundation
  11. 2015-06-26 Ruben Safir <mrbrklyn-at-panix.com> Subject: [LIU Comp Sci] Fwd: strange function types in unistd.h

NYLXS are Do'ers and the first step of Doing is Joining! Join NYLXS and make a difference in your community today!