Japan's UAS Sensing Δ 27th of August 2018 Ω 10:13 AM

ΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞ
yourDragonXi~ Fuji Heavy Industries Ltd
yourDragonXi~ Kawada Industries
yourDragonXi~ Yamaha Motor Co.
yourDragonXi~ Sony
yourDragonXi~ DroneCloud
yourDragonXi~ ProDrone
yourDragonXi~ Terra-Drone
yourDragonXi~ Hamamatsu
yourDragonXi~ ZMP
yourDragonXi~ sense for Ξ
ξ
ξ
ξ
«UAS Sensing
Θ

Θ
ΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞΞ































































~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
yourDragonXi ~ Fuji Heavy Industries Ltd

»Fuji Heavy Industries Ltd



select: ~[Σ] ~[Ω] ~[Δ]!































































~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
yourDragonXi ~ Kawada Industries

»Kawada Industries



select: ~[Σ] ~[Ω] ~[Δ]!































































~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
yourDragonXi ~ Yamaha Motor Co.

»Yamaha Motor Co.



select: ~[Σ] ~[Ω] ~[Δ]!































































~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
yourDragonXi ~ Sony

Sony to begin developing drones
ξ »www.aerosense.co.jp
ξ holds the leading global market share in image sensors used in digital cameras and other devices
ξ wants to expand the use of the technology to drones
ξ is considering drones equipped with these sensors to inspect infrastructure such as aging tunnels and bridges
ξ drones can also be used to check how agricultural crops are growing
ξ analysts say that the economic effect of drones will be over 76 billion dollars in the US alone by 2025
ξ the new project may help Sony rebuild its finances, as its television business has been in the red for the past 10 years

Other players
ξ the IT industry is moving to use drones for commercial purposes
ξ Amazon.com announced a plan to deliver goods using drones
ξ Facebook to use drones to beam Internet signals to various parts of the world as a way of expanding connectivity
ξ Facebook has also begun developing solar-powered drones that can fly for long periods



select: ~[Σ] ~[Ω] ~[Δ]!































































~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
yourDragonXi ~ DroneCloud

»DroneCloud
ξ a cloud-based platform for managing drone project information and data (see the sketch after this list)
ξ project creation
ξ data management
ξ flight log management
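
A minimal Python sketch of how such project and flight-log records might be organized; the class and field names are illustrative assumptions, not DroneCloud's actual schema:

# Sketch: project and flight-log records for a drone data-management platform.
# Names and fields are illustrative assumptions, not DroneCloud's schema.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class FlightLog:
    drone_id: str
    pilot: str
    takeoff: datetime
    landing: datetime

@dataclass
class Project:
    name: str
    logs: list = field(default_factory=list)

    def add_log(self, log: FlightLog) -> None:
        self.logs.append(log)

    def total_flight_hours(self) -> float:
        return sum((l.landing - l.takeoff).total_seconds() for l in self.logs) / 3600.0

survey = Project("bridge-inspection")
survey.add_log(FlightLog("UAV-01", "operator-a",
                         datetime(2018, 8, 27, 9, 0), datetime(2018, 8, 27, 9, 45)))
print(f"{survey.total_flight_hours():.2f} h logged")   # 0.75 h logged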



select: ~[Σ] ~[Ω] ~[Δ]!































































~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
yourDragonXi ~ ProDrone

»ProDrone



select: ~[Σ] ~[Ω] ~[Δ]!































































~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
yourDragonXi ~ Terra-Drone

»Terra-Drone
ξ drone operator
ξ forest survey services with lidar-equipped UAVs
ξ Riegl and Velodyne scanners

“Smart Drone” using “3D map” and “Drone Port”
ξ »www.kddi.com
Terra Drone and KDDI Corporation succeeded in a fully autonomous flight experiment of
the “Smart Drone” using a “3D map” and a “Drone Port”:
the world’s first long-distance drone flight, about 6.3 km, via the “Drone Port”.
The port enabled the drone to recharge automatically, and the drone
returned successfully to the landing site after spraying terraced Nishikigoi (carp) ponds with pesticide.

Terra Drone and KDDI aim to establish an infrastructure
that enables secure long-distance autonomous flight using the Smart Drone and the mobile communication network.

This demonstration tested safe flight-altitude setting on the “3D map”
and automatic charging at the “Drone Port,”
and verified that long-distance autonomous drone flight is technically possible.

KDDI has partnered with Terra Drone and Zenrin,
a Japanese map publisher, to jointly develop the “Smart Drones Platform,”
which enables safe drone flight using the mobile communication network and
a 3D map for autonomous flight, and
sets a secure flight altitude automatically.

The 3D map enables a drone to recognize altitude differences in topography
such as mountains, hills, and buildings; in this experiment,
Terra Drone and KDDI achieved automatic discrimination of elevation differences of more than 100 m.

The “Drone Port,” developed by the industrial drone manufacturer Prodrone Co., Ltd.,
has an automatic landing function based on image recognition,
which enables long-distance flight via the Drone Port.

Yamakoshi, Niigata Prefecture, Japan,
where the demonstration was conducted,
has some of the leading Nishikigoi ponds in Japan.
To breed large, beautiful Nishikigoi,
spraying the terraced ponds with pesticide from a boat is time-consuming and labor-intensive.

With a Smart Drone capable of long-distance autonomous flight using the “3D map” and the mobile communication network,
the flight altitude is set automatically and pesticide is sprayed effectively just by specifying the location.

In the future, once a long-distance autonomous flight infrastructure for smart drones
using the mobile communication network is established,
the platform can serve not only agriculture
but also topographic and equipment surveying, facility security,
disaster monitoring, and delivery to remote areas.
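
As a rough illustration of the altitude-setting idea described above (not Terra Drone's or KDDI's actual algorithm), a waypoint altitude can be derived from a 3D elevation map plus a fixed clearance; the Python sketch below uses an assumed elevation grid and a 30 m clearance:

# Sketch: derive safe flight altitudes along a route from a 3D elevation map.
# The elevation grid and the 30 m clearance are illustrative assumptions.
ELEVATION = {            # (x, y) grid cell -> terrain elevation in meters
    (0, 0): 12.0, (1, 0): 48.0, (2, 0): 130.0,
}
CLEARANCE_M = 30.0       # minimum height to keep above the terrain

def safe_altitude(cell):
    """Terrain elevation at the cell plus the required clearance."""
    return ELEVATION[cell] + CLEARANCE_M

def plan_route(cells):
    """Assign each waypoint an altitude that clears the terrain below it."""
    return [(cell, safe_altitude(cell)) for cell in cells]

for cell, alt in plan_route([(0, 0), (1, 0), (2, 0)]):
    print(f"waypoint {cell}: fly at {alt:.0f} m")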



select: ~[Σ] ~[Ω] ~[Δ]!































































~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
yourDragonXi ~ Hamamatsu

»Hamamatsu

Lidar: A photonics guide to the autonomous vehicle market
Comparing lidar with competing sensor technologies (camera, radar, and ultrasonic)
reinforces the need for sensor fusion,
as well as for careful selection of photodetectors, light sources, and MEMS mirrors.

Advances in sensor technology, imaging, radar, light detection and ranging (lidar),
electronics, and artificial intelligence have enabled dozens of advanced driver assistance systems (ADAS),
including collision avoidance, blind-spot monitoring, lane-departure warning, and park assist.

Synchronizing the operation of such systems through sensor fusion
allows fully autonomous or self-driving vehicles
to monitor their surroundings and warn drivers of potential road hazards, or
even take evasive actions independent of the driver to prevent collision.

Autonomous vehicles must also differentiate and recognize objects ahead when traveling at high speed.
Using distance-gauging technology,
these self-driving cars must rapidly construct a three-dimensional (3D) map
up to a distance of about 100 m,
as well as create high-angular-resolution imagery at distances up to 250 m.

And if the driver is not present,
the artificial intelligence of the vehicle must make optimal decisions.

One of several basic approaches for this task
measures round-trip time of flight (ToF) of a pulse of energy
traveling from the autonomous vehicle to the target and back to the vehicle.
Distance to the reflection point can be calculated
when one knows the speed of the pulse through the air; the pulse
can be ultrasound (sonar), radio wave (radar), or light (lidar).

Of these three ToF techniques,
lidar is the best choice to provide higher-angular-resolution imagery
because its smaller diffraction (beam divergence)
allows better recognition of adjacent objects compared to radar (see Fig. 1).
This higher angular resolution is especially important at high speed
to provide enough time to respond to a potential hazard such as head-on collision.

LASER SOURCE SELECTION
In ToF lidar, a laser emits a pulse of light of duration τ
that activates the internal clock in a timing circuit at the instant of emission (see Fig. 2).
The reflected light pulse from the target reaches a photodetector,
producing an electrical output that deactivates the clock.
This electronically measured round-trip ToF Δt
allows calculation of the distance R to the reflection point.
If the laser and photodetector are practically at the same location, the distance is given by

R = (1/2) (c/n) Δt

where c is the speed of light in vacuum and
n is the index of refraction of the propagation medium (for air, approximately 1).

Two factors affect the distance resolution ΔR:
the uncertainty δΔt in measuring Δt and
the spatial width w of the pulse (w = cτ),
if the diameter of the laser spot is larger than the size of the target feature to be resolved.

The first factor implies ΔR = ½ c δΔt,
whereas the second implies ΔR = ½ w = ½ cτ.
If the distance is to be measured with a resolution of 5 cm,
the above relations separately imply
that δΔt is approximately 300 ps and τ is approximately 300 ps.
Time-of-flight lidar requires photodetectors and detection electronics
with small time jitter (the main contributor to δΔt) and
lasers capable of emitting short-duration pulses,
such as relatively expensive picosecond lasers.
A laser in a typical automotive lidar system produces pulses of about 4 ns duration,
so minimal beam divergence is essential.
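
A quick numeric check of the range equation and the 5 cm resolution requirement above, as a Python sketch (the 100 m example range is an assumption):

# Sketch: ToF range equation and resolution requirement from the text above.
C = 3.0e8          # speed of light in vacuum, m/s
N_AIR = 1.0        # refractive index of air (approximately 1)

def tof_range(dt_s):
    """R = (1/2)(c/n)·Δt: distance from the measured round-trip time."""
    return 0.5 * (C / N_AIR) * dt_s

def required_jitter(resolution_m):
    """From ΔR = (1/2)·c·δΔt, solve δΔt = 2·ΔR/c."""
    return 2.0 * resolution_m / C

print(tof_range(667e-9))        # a ~100 m target returns after Δt ≈ 667 ns
print(required_jitter(0.05))    # 5 cm resolution needs δΔt ≈ 333 ps (~300 ps)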

One of the most critical choices for automotive lidar system designers
is the light wavelength.
Several factors constrain this choice:
safety to human vision, interaction with the atmosphere,
availability of lasers, and availability of photodetectors.

The two most popular wavelengths are 905 and 1550 nm,
with the primary advantage of 905 nm
being that silicon absorbs photons at this wavelength and
silicon-based photodetectors are generally less expensive
than the indium gallium arsenide (InGaAs) infrared (IR) photodetectors
needed to detect 1550 nm light.

However, the higher human-vision safety of 1550 nm
allows the use of lasers with a larger radiant energy per pulse, an important factor in the photon budget.

Atmospheric attenuation (under all weather conditions),
scattering from airborne particles, and reflectance from target surfaces are wavelength-dependent.
This is a complex issue for automotive lidar
because of the myriad of possible weather conditions and types of reflecting surfaces.
Under most realistic settings, loss of light at 905 nm is less
because water absorption is stronger at 1550 nm than at 905 nm [1].

PHOTON DETECTION OPTIONS
Only a small fraction of photons emitted in a pulse
ever reach the active area of the photodetector.
If the atmospheric attenuation does not vary along the pulse's path,
the beam divergence of the laser light is negligible,
the illumination spot is smaller than the target,
the angle of incidence is zero, and
the reflection is Lambertian, then the received optical peak power P(R) is:

P(R) = P0 · ρ · (A0 / (π R²)) · η0 · exp(−2 γ R)

where P0 is the optical peak power of the emitted laser pulse,
ρ is the reflectivity of the target,
A0 is the receiver's aperture area,
η0 is the detection optics' spectral transmission, and
γ is the atmospheric extinction coefficient.

This equation shows that the received power rapidly decreases with increasing distance R.
For a reasonable choice of the parameters and R = 100 m,
the number of returning photons on the detector's active area is on the order of
a few hundred to a few thousand out of the more than 10^20 typically emitted.
These photons compete for detection with background photons carrying no useful information.
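
A numeric sketch of this photon budget in Python; all parameter values below are illustrative assumptions, not measured figures:

# Sketch: received peak power and returning photon count for ToF lidar.
import math

P0    = 75.0       # emitted peak power, W (assumed)
RHO   = 0.1        # target reflectivity, dark diffuse surface (assumed)
A0    = 1e-4       # receiver aperture area, m^2 (~1 cm^2, assumed)
ETA0  = 0.9        # spectral transmission of detection optics (assumed)
GAMMA = 1e-4       # atmospheric extinction coefficient, 1/m (assumed)
R     = 100.0      # distance to target, m

# P(R) = P0 * rho * A0 / (pi * R^2) * eta0 * exp(-2*gamma*R)
P_R = P0 * RHO * A0 / (math.pi * R**2) * ETA0 * math.exp(-2.0 * GAMMA * R)

# Photons per pulse: energy in a 4 ns pulse over the photon energy at 905 nm.
H, C, WAVELENGTH, T_PULSE = 6.626e-34, 3.0e8, 905e-9, 4e-9
photon_energy = H * C / WAVELENGTH            # ~2.2e-19 J
n_photons = P_R * T_PULSE / photon_energy

print(f"received peak power: {P_R:.2e} W")
print(f"returning photons per pulse: {n_photons:.0f}")   # a few hundred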

Using a narrowband filter can reduce the amount of background light reaching the detector,
but the amount cannot be reduced to zero.
The effect of the background is the reduction of the detection dynamic range and
higher noise (background photon shot noise).
It's noteworthy that the terrestrial solar irradiance under typical conditions is less at 1550 nm than at 905 nm.

Creating a 3D map in a full 360° × 20° strip surrounding a car
requires a raster-scanned laser beam or multiple beams, or
flooding the scene with light and gathering a point cloud of data returns.

The former approach is known as scanning lidar and
the latter as flash lidar.
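
For reference, a minimal Python sketch of how one scanned return (azimuth, elevation, range) becomes a point in the 3D map; this is generic spherical-to-Cartesian geometry, not any vendor's pipeline:

# Sketch: convert lidar returns (azimuth, elevation, range) to XYZ points.
import math

def to_xyz(azimuth_deg, elevation_deg, range_m):
    """Spherical-to-Cartesian conversion for one lidar return."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

# One sweep of a single beam at -5 deg elevation, one return per 1 deg azimuth.
point_cloud = [to_xyz(az, -5.0, 42.0) for az in range(360)]
print(point_cloud[0])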

There are several approaches to scanning lidar.
In the first, exemplified by Velodyne (San Jose, CA),
the roof-mounted lidar platform rotates at 300-900 rpm
while emitting pulses from 64 laser diodes operating at 905 nm.
Each beam has a dedicated avalanche photodiode (APD) detector.

A similar approach uses a rotating multi-faceted mirror
with each facet at a slightly different tilt angle
to steer a single beam of pulses in different azimuthal and declinational angles.
The moving parts in both designs represent a failure risk in mechanically rough driving environments.

The second, more compact approach to scanning lidar
uses a tiny microelectromechanical systems (MEMS) mirror
to electrically steer a beam or beams in a 2D orientation.
Although technically there are still moving parts (oscillating mirrors),
the amplitude of the oscillation is small and
the frequency is high enough to prevent mechanical resonances between the MEMS mirror and the car.
However, the confined geometry of the mirror constrains its oscillation amplitude,
which translates into a limited field of view, a disadvantage of this MEMS approach.
Nevertheless, this method is gaining interest because of its low cost and proven technology.

Optical phased array (OPA) technology,
the third competing scanning lidar technique,
is gaining popularity for its reliable, "no-moving-parts" design.
It consists of arrays of optical antenna elements
that are equally illuminated by coherent light.
Beam steering is achieved by independently controlling the phase and
amplitude of the re-emitted light by each element, and
far-field interference produces a desired illumination pattern
from a single beam to multiple beams.
Unfortunately, light loss in the various OPA components restricts the usable range.

Flash lidar floods the scene with light,
with the illumination region matched to the field of view of the detector.
The detector is an array of APDs at the focal plane of the detection optics.
Each APD independently measures ToF to the target feature imaged on that APD.
This is a truly "no-moving-parts" approach
where the tangential resolution is limited by the pixel size of the 2D detector.
The major disadvantage of flash lidar, however, is photon budget:
once the distance is more than a few tens of meters,
the amount of returning light is too small for reliable detection.
The budget can be improved at the expense of tangential resolution
if, instead of flooding the scene with photons, structured light (a grid of points) illuminates it.
Vertical-cavity surface-emitting lasers (VCSELs)
make it possible to create projectors emitting thousands of beams simultaneously in different directions.

BEYOND TIME-OF-FLIGHT LIMITATIONS

Time-of-flight lidar is susceptible to noise
because of the weakness of the returned pulses and
the wide bandwidth of the detection electronics, and
threshold triggering can produce errors in the measurement of Δt.
For these reasons, frequency-modulated continuous-wave (FMCW) lidar is an interesting alternative.

In FMCW radar, or chirped radar,
the antenna continuously emits radio waves whose frequency is modulated: for example,
linearly increasing from f0 to fmax over time T and
then linearly decreasing from fmax to f0 over time T.
If the wave reflects from a moving object at some distance and
comes back to the emission point, its instantaneous frequency will differ from the one being emitted at that instant.
The difference is because of two factors:
the distance to the object and its relative radial velocity.
One can electronically measure the frequency difference and
simultaneously calculate the object's distance and velocity (see Fig. 3).
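
A Python sketch of the standard triangular-chirp arithmetic described here (generic FMCW relations; the chirp parameters and beat frequencies below are assumptions):

# Sketch: recover range and radial velocity from triangular-chirp FMCW beats.
C = 3.0e8               # speed of light, m/s
F0, FMAX = 77e9, 78e9   # chirp start/stop frequency, Hz (assumed, radar-style)
T = 1e-3                # up-ramp (and down-ramp) duration, s (assumed)
B = FMAX - F0           # chirp bandwidth, Hz

def range_and_velocity(f_up, f_down):
    """f_up / f_down: beat frequencies on the up- and down-ramp, Hz."""
    f_range   = (f_up + f_down) / 2.0      # component due to distance
    f_doppler = (f_down - f_up) / 2.0      # component due to radial velocity
    distance = C * T * f_range / (2.0 * B)
    velocity = f_doppler * (C / F0) / 2.0  # v = f_D * wavelength / 2
    return distance, velocity

d, v = range_and_velocity(66_000, 67_000)
print(f"{d:.1f} m, {v:.2f} m/s")   # ~10.0 m, ~0.97 m/s with these inputs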

Inspired by chirped radar, FMCW lidar can be approached in different ways.
In the simplest design, one can chirp-modulate the intensity of the beam of light that illuminates the target.
This frequency is subject to the same laws (such as the Doppler effect)
as the carrier frequency in FMCW radar.
The returned light is detected by a photodetector to recover modulation frequency.
The output is amplified and mixed with the local oscillator,
allowing measurement of the frequency shift and, from that,
calculation of the distance to and the speed of the target.

But FMCW lidar has some limitations.
Compared to a ToF lidar, it requires more computational power and,
therefore, is slower in generating a full 3D surround view.
In addition, the accuracy of the measurements is very sensitive to linearity of the chirp ramp.
Although designing a functional lidar system is challenging,
none of these challenges are insurmountable.
As the research continues,
we are getting closer to the time when the majority of cars driving off the assembly line will be fully autonomous.

REFERENCE
1. J. Wojtanowski et al., Opto-Electron. Rev., 22, 3, 183-190 (2014).

Slawomir Piatek is a research scientist and
Jake Li is a marketing engineer, both at Hamamatsu, Bridgewater, NJ;
e-mail: jli@hamamatsu.com; www.hamamatsu.com.



select: ~[Σ] ~[Ω] ~[Δ]!































































~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
yourDragonXi ~ ZMP

»www.zmp.co.jp
ξ autonomous taxi services for the Tokyo 2020 Olympics
ξ formed the Aerosense drone joint venture with Sony



select: ~[Σ] ~[Ω] ~[Δ]!































































~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Small & Smart Inc reserves rights to change this document without any notice
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~