Article
The Detection and Recognition of RGB-LED-ID
Based on Visible Light Communication using
Convolutional Neural Network
Weipeng Guan 1,2,*,†, Jingyi Li 1,†, Shangsheng Wen 3,*, Xinjie Zhang 4, Yufeng Ye 4, Jieheng Zheng 5 and Jiajia Jiang 4
1 School of Automation Science and Engineering, South China University of Technology, Guangzhou 510640,
China; [email protected]
2 SIAT-Sense Time Joint Lab, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences,
Shenzhen 518055, China
3 School of Materials Science and Engineering, South China University of Technology, Guangzhou 510640,
China
4 School of Electronic and Information Engineering, South China University of Technology,
Guangzhou 510640, China; [email protected] (X.Z.); [email protected] (Y.Y.);
[email protected] (J.J.)
5 School of Computer Science and Engineering, South China University of Technology, Guangzhou 510006,
China; [email protected]
* Correspondence: [email protected] (W.G.); [email protected] (S.W.)
† These authors contributed equally to this work; Weipeng Guan and Jingyi Li are co-first authors of the article.
Received: 18 February 2019; Accepted: 27 March 2019; Published: 3 April 2019
Abstract: In this paper, an online to offline (O2O) method based on visible light communication (VLC) is proposed, which differs from traditional VLC based on modulation and demodulation; it is a new form of VLC based on modulation and recognition. We use an RGB light-emitting diode (RGB-LED) as the transmitter and drive it with pulse width modulation (PWM) so that it flickers at high frequency, which creates several distinguishable features. At the receiver, a complementary metal-oxide-semiconductor (CMOS) image sensor captures the LED images with stripes. A convolutional neural network (CNN) is then introduced into our system as a classifier. By training the classifier offline and recognizing the LED-ID (the unique identification of each different LED) online, the proposed scheme improves both the speed of LED-ID identification and the robustness of the system. This is the first application of a CNN in the field of VLC.
Keywords: visible light communication (VLC); online to offline (O2O); RGB-LED; convolutional neural network (CNN); complementary metal-oxide-semiconductor (CMOS)
1. Introduction
Visible light communication (VLC) has attracted much attention in recent years, because RGB-LED communication devices (such as lamps, TVs, and traffic signs) are used everywhere [1]. Moreover, VLC is safer than radio frequency (RF) communication, because visible light propagates along the line of sight (LOS) and cannot penetrate opaque surfaces. Therefore, there is hope that a powerful online to offline (O2O) and peer-to-peer (P2P) communication medium might be developed that receives information from multiple RGB-LEDs. In this paper, we propose a new VLC scheme, different from previous modulation- and demodulation-based schemes, to achieve higher accuracy and robustness for the recognition of the RGB-LED-ID (the unique identification of each different LED).
There are two kinds of VLC: one uses a photodiode (PD) as the receiver, and the other uses an image sensor as the receiver. VLC based on image sensors has wider application value [2]. In this paper, we propose optical fringe code (OFC) detection and recognition based on visible light communication using a convolutional neural network (CNN). We use an RGB-LED as the transmitter, and it is driven by pulse width modulation (PWM). At the receiver, a complementary metal-oxide-semiconductor (CMOS) image sensor in a smartphone is used to capture the LED projection. The LED projection takes the form of a stripe image because of the rolling shutter mechanism of CMOS sensors. A CNN is then used to build the classifier. After offline training is completed, online recognition of the OFC can be implemented. The experimental results show that the proposed scheme not only increases the distance and accuracy of OFC recognition, but also improves its applicability and stability. In addition, the number of OFCs is far greater than in our previous work in Reference [3]. Therefore, the OFC is expected to become an effective complement to traditional 2D code scanning technology in the future.
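For concreteness, the sketch below shows the kind of convolutional classifier this pipeline implies: offline, it is trained on labelled stripe images; online, its predicted class index serves as the recognized LED-ID. The paper only states that a CNN with an FCN-style structure is used (see Section 5), so the layer sizes, input resolution, and training settings here are our own illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class OFCClassifier(nn.Module):
    """Small CNN mapping a cropped LED stripe image to one of n_classes LED-OFC IDs.
    The architecture is illustrative; the paper does not specify layer sizes."""
    def __init__(self, n_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # FCN-style head: global average pooling plus a 1x1 convolution
        # instead of a large fully connected layer.
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(64, n_classes, kernel_size=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x)).flatten(1)   # (batch, n_classes) logits

# Offline training step on a batch of labelled stripe images (shapes assumed).
model = OFCClassifier(n_classes=16)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 128, 128)      # stand-in for cropped LED-OFC images
labels = torch.randint(0, 16, (8,))
loss = criterion(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()

# Online recognition: the predicted class index is the recognized LED-OFC ID.
with torch.no_grad():
    pred_id = model(images[:1]).argmax(dim=1)
```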
Compared with the traditional modulation and demodulation method and with our previous work, the main advantage of the proposed method is its practicability. Our system has many application scenarios. For example, mobile payment is now a very popular technology, and the OFC can serve as an effective complement to mobile payment technology. In the future field of autonomous driving, the OFC can be used as an important means of realizing navigation signs; because it is self-luminous, it is more robust than pattern recognition alone. Last but not least, in the outdoor advertising business, the OFC can provide new ideas for night advertising. Combined with virtual reality technology, the OFC can offer attractive potential for illuminated advertising boards. Overall, it could be a promising technology.
2. Related Work
Most of the early studies on VLC focused on increasing the communication rate, the signal-to-noise ratio, and so on. A barcode-based VLC system between smart devices was proposed in Reference [4], which showed a high level of security. Unfortunately, its error rate increased due to the "blooming" effect of image sensors caused by uneven light output. In Reference [5], the authors found a way to avoid the "blooming" effect. However, the effective transmission distance was too short (about 4 cm), and when the distance became larger, the data packet could be partially lost. There are many similar works, such as References [6–8].
These works have all focused on signal modulation and demodulation, seeking improvements in signal encoding and decoding to transmit large amounts of data. In our previous work [3], for the first time, we pushed VLC from "encoding + decoding" to "encoding + recognition." This offers more practicality than traditional VLC, but the method requires complicated image processing and feature selection, which is not universal. What is more, that system provides only 1035 unique LED-IDs under the same light intensity, which greatly restricts the wider application of "encoding + recognition" VLC, and its recognition rate is also limited by the accuracy of feature extraction.
3. Theory
Figure 1. Rolling shutter mechanism of the complementary metal-oxide-semiconductor (CMOS) sensor.
In this paper, the PWM method is employed. It is a very effective technique for controlling analog circuits through the digital outputs of microprocessors. By periodically switching the LED on and off, the data bits 1/0 are encoded and transmitted. In addition, the frequency, coding sequence, and phase are varied to produce OFCs with different characteristics, as shown in Figures 2–4.
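To make this concrete, the short sketch below simulates how a rolling-shutter sensor converts a PWM-driven LED into a stripe pattern: each sensor row is exposed slightly later than the previous one, so the on/off state of the LED at that row's exposure time determines whether the row is bright or dark. The code is only an illustration; the row readout time, image height, and PWM settings are assumed values, not parameters reported in the paper.

```python
import numpy as np

def rolling_shutter_rows(freq_hz, duty, phase_deg, n_rows=2448, t_row=20e-6):
    """On/off state of a PWM-driven LED as seen by each sensor row.

    Each row starts its exposure t_row seconds after the previous one
    (rolling shutter), so a high-frequency PWM signal is sampled row by
    row and appears as alternating bright/dark stripes in the image.
    All parameter defaults here are assumptions for illustration.
    """
    t = np.arange(n_rows) * t_row                        # exposure time of each row
    cycle_pos = (t * freq_hz + phase_deg / 360.0) % 1.0  # position within the PWM period
    return (cycle_pos < duty).astype(np.uint8)           # 1 = LED on (bright stripe)

# A 1000 Hz PWM signal with a 50% duty ratio and no phase offset.
rows = rolling_shutter_rows(freq_hz=1000, duty=0.5, phase_deg=0)
band_widths = np.diff(np.flatnonzero(np.diff(rows)))     # widths of successive bands
print(rows[:60])
print(band_widths[:6])   # about 25 rows per band for these assumed settings
```

Changing freq_hz, duty, or phase_deg changes the stripe width, the bright/dark ratio, and the stripe positions, which is what creates distinguishable OFCs.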
Figure 2. The effect of different frequencies.
By increasing the frequency, the width of the stripes will decrease and the number of stripes will increase. An example is shown in Figure 2 under the same distance condition.
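A rough way to quantify this relationship (our own estimate; the paper does not give this formula) is as follows: if the CMOS sensor reads out one row every t_row seconds, one PWM period of duration 1/f covers about 1/(f · t_row) rows, so at a 50% duty ratio each bright or dark stripe is roughly 1/(2 f · t_row) rows wide, and an LED image spanning H rows contains about 2 f · t_row · H stripes. For example, assuming a row readout time of 20 μs, raising the frequency from 1000 Hz to 2000 Hz narrows each stripe from about 25 rows to about 12 rows while doubling the number of stripes, consistent with the trend in Figure 2.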
In addition to the number and width of the stripes, the color and color cycle of the stripes can also be changed by changing the phase between the LEDs of different colors. An example is shown in Figure 3: the phase difference of the right image in Figure 3 is 15°, and that of the left one is 75°. It can be seen that different phase differences lead to different colors of the LED projections.
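The effect of phase can be illustrated in the same row-by-row way. The sketch below (our illustration with assumed parameters, not the authors' code) computes which of the red, green, and blue chips are on at each row's exposure time; the combination of on/off states determines the observed stripe color (for example, red and green together appear yellow, and all three together appear white).

```python
import numpy as np

COLOR_NAMES = {
    (0, 0, 0): "black", (1, 0, 0): "red",    (0, 1, 0): "green",
    (0, 0, 1): "blue",  (1, 1, 0): "yellow", (1, 0, 1): "magenta",
    (0, 1, 1): "cyan",  (1, 1, 1): "white",
}

def stripe_colors(freq_hz, duty, phases_deg, n_rows=300, t_row=20e-6):
    """Stripe color seen at each row for an RGB-LED whose three chips share one
    PWM frequency but are shifted by the given phases (red, green, blue)."""
    t = np.arange(n_rows) * t_row
    states = []
    for phase in phases_deg:
        cycle_pos = (t * freq_hz + phase / 360.0) % 1.0
        states.append((cycle_pos < duty).astype(int))
    rgb = np.stack(states, axis=1)                       # (n_rows, 3) on/off states
    return [COLOR_NAMES[tuple(int(v) for v in row)] for row in rgb]

# Compare two phase settings; the chip-to-chip offsets used here are assumptions.
for shift in (15, 75):
    colors = stripe_colors(1000, 0.5, phases_deg=(0, shift, 2 * shift))
    print(shift, colors[:12])
```

With a different phase shift, the on/off overlaps between the chips change, so the sequence of colors within each PWM period changes, which matches the difference between the two images in Figure 3.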
Figure 5. The effect of the distance between the LED and the CMOS camera.
Figure 7. (a) System overview. (b) Experimental setup.
The key hardware parameters are shown in Table 1.
Table 1. Hardware parameters of the experiments.

Parameter                                Value
The focal length/mm                      4.25
The resolution of the camera             4032 × 2448
The exposure time of the camera/ms       0.1
The ISO of the camera                    100
The aperture of the camera               F1.7
The diameter of the LED downlight/cm     6
The power of each LED/W                  9
Current of each LED/mA                   85
Voltage of each LED/V                    5
4.2. Distance and LED-OFC Recognition Result Analysis
As mentioned before, a change in distance leads to a change in the number of fringes. What is more, the light intensity also decreases as the distance increases, which will have an impact on the final recognition accuracy. It is worth mentioning that, for each LED-OFC, the absolute width of the stripes does not change with distance, since the focal length does not change at different distances. An experiment was conducted to explore the impact of distance, with the other parameters set as follows: the phase difference was 0°, the encoding sequence was "01", the duty ratio was 50%, and different frequencies (500 Hz–6500 Hz with an interval of 400 Hz) were used to generate a different LED-OFC ID at each distance. We acquired about 3000 LED-OFC images for training and 300 images for testing at each distance. The experimental results are shown in Figure 8.
Figure 8. Recognition accuracy rates at different distances.
The recognition accuracy fluctuated little at the initial stage until the distance exceeded 3.5 m, and then began to decline rapidly, as seen above. This indicates that the proposed method can achieve a much greater distance than the traditional modulation and demodulation method, which can only achieve transmission distances of 10 cm to 50 cm. As mentioned before, as the distance increases, the number of stripes decreases, which causes the transmitted data frame structure to be lost, so the transmission distance is strictly limited. For example, the number of stripes cannot be less than 8 if the transmitted encoding sequence has 8 bits; otherwise a complete packet cannot be received, and the transmitted data will be demodulated incorrectly. The above experimental result proves that the proposed method does not have this limitation.
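A rough geometric argument (ours, not given in the paper) makes this contrast concrete. Under a pinhole camera model, an LED of diameter D imaged from distance d covers about H_LED ≈ f_c · D / (d · p) sensor rows, where f_c is the focal length and p is the pixel pitch, so H_LED shrinks roughly in proportion to 1/d. Combined with the stripe-count estimate sketched in Section 3 (about 2 f · t_row · H_LED stripes), a demodulation scheme that must capture at least 8 stripes to recover an 8-bit packet quickly runs out of stripes as d grows, whereas the proposed classifier only needs the remaining stripe pattern to stay distinguishable from the other classes.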
4.3.1. Low Frequency Range
As shown in Figure 9, the recognition accuracy rate declined rapidly when the frequency resolution dropped below 3 Hz. When the frequency resolution was greater than 5 Hz, the recognition accuracy was acceptable. That is, to ensure high recognition accuracy, the frequency resolution should be greater than 5 Hz in the low frequency range.
Figure 9. The recognition accuracy rate at different frequency resolutions in the low frequency range.
4.3.2. High Frequency Range

As shown in Figure 10, the minimum frequency resolution in the high frequency range was eight times (40 Hz) that in the low frequency range, as seen above. This also confirms our previous analysis that, because the stripes of the LED-OFC become excessively dense, the minimum frequency resolution is greater in the high frequency range.

From the above experiments, the conclusion can be drawn that the minimum frequency resolution needed to ensure high recognition accuracy is 40 Hz. The total number of different types of LED-OFC can be calculated as follows: (10000 − 5000)/40 + (5000 − 500)/5 = 1025. Moreover, since the color features of the stripes, including the color and the color change cycle, would change if the frequency of each LED were different, even more different LED-OFCs can be obtained; in the above experiments, only the case in which all three frequencies were the same was considered. Therefore, there are about 1025 usable frequencies. As far as the RGB-LED is concerned, since the frequencies of the three LED chips are non-interfering, three different frequencies can be used. According to the principle of permutation and combination, the total number should be about 10^9 (A(1025, 3) = 1025 × 1024 × 1023 ≈ 1.07 × 10^9).
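The counting above can be verified directly; the snippet below (ours, for checking the arithmetic only) reproduces the 1025 distinguishable frequencies and the order of magnitude of the three-chip permutation.

```python
import math

# 5 Hz steps over 500-5000 Hz plus 40 Hz steps over 5000-10000 Hz.
n_freq = (5000 - 500) // 5 + (10000 - 5000) // 40
print(n_freq)                # 1025

# Three non-interfering chips (R, G, B), each given a distinct frequency.
print(math.perm(n_freq, 3))  # 1,073,740,800, i.e., on the order of 10^9
```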
Figure 10. The recognition accuracy rate at different frequency resolutions in the high frequency range.
4.4. Duty Ratio and LED-OFC Recognition Result Analysis

Although changing the duty ratio changes the number, width, and color of the stripes, it also changes the light intensity, which may have an impact on the final recognition result. In this experiment, we explored the effect of different duty ratios on the recognition result. The other parameters were set as follows: the frequency was 1000 Hz, the phase difference was 0°, the distance was 20 cm, and the encoding sequence was "01". As above, for each duty ratio, 2000 images were acquired for training and 200 images for testing. The experimental result is shown in Figure 11.
Figure 11. The recognition accuracy rate at different duty ratios.
As seen above, the recognition accuracy remained at 100% as the duty ratio changed. This indicates that the effect of the duty ratio on the final recognition result is negligible. This also lays the foundation for the subsequent exploration of the impact of the encoding sequence, as different encoding sequences may result in different duty ratios, which means the experiment has multiple variables. If the duty ratio is negligible, subsequent experiments will be able to explore the effect of encoding sequences on recognition accuracy by controlling variables.
Figure 12. The recognition accuracy rate at different phase difference resolutions.
As we can see, the recognition accuracy became stable (above 95%) when the phase difference resolution was greater than 7.2°. This indicates that the minimum phase difference resolution needed to ensure high recognition accuracy is 7.2°, and the total number of LED-OFC categories that can be distinguished is 50 (360/7.2 = 50). As with the frequency resolution experiments above, the rough number can be calculated as follows: there are 50 types of phase difference, as the result shows, and there are two non-interfering phase differences for an RGB-LED: a phase difference between the red and green LEDs, and a phase difference between the green and blue LEDs. According to the principle of permutation and combination, the total number should be on the order of 10^3 (A(50, 2) = 50 × 49 = 2450).
4.6. Encoding Sequence and LED-OFC Recognition Result Analysis

To explore the effect of the encoding sequence on LED-OFC recognition, a six-digit encoding sequence was used. In addition, since the encoding sequence is transmitted cyclically, some sequences are duplicates of one another. Based on this, the non-repeating sequences were selected, as shown in Table 2. The other parameters were as follows: the frequency was 1000 Hz, the phase difference was 0°, and the distance was 20 cm. For each encoding sequence, about 300 images were acquired for training and 30 images for testing. The final experiment result is shown in Figure 13.
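The selection of non-repeating sequences can be illustrated as follows: since the sequence is transmitted cyclically, two sequences that are rotations of each other produce the same fringe pattern, so only one representative per rotation class needs to be kept. The snippet below is our illustration of this idea for six-bit sequences, not the authors' selection procedure, and it does not reproduce Table 2.

```python
from itertools import product

def canonical_rotation(bits: str) -> str:
    """Lexicographically smallest rotation of a bit string; all rotations of a
    cyclically transmitted sequence produce the same optical fringe code."""
    return min(bits[i:] + bits[:i] for i in range(len(bits)))

# One representative per rotation class of six-bit binary sequences.
representatives = sorted({canonical_rotation("".join(bits))
                          for bits in product("01", repeat=6)})
print(len(representatives))   # 14 rotation classes (including all-zeros and all-ones)
print(representatives)
```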
As seen above, the recognition accuracy was high enough; most sequences achieved a 100% accuracy rate, and the recognition accuracy of all encoding sequences was above 95%. As for the relatively low accuracies (96.43% and 95.83%), the recognition errors may have been caused by jitter during shooting or other reasons, but, in general, the recognition accuracy was completely acceptable. Moreover, the length of the encoding sequence can be changed, and encoding sequences of different lengths will not produce the same LED-OFC; this indicates that, in theory, many different encoding sequences (with different lengths and bit patterns) can be used to generate many different LED-OFCs which can be recognized with high accuracy. In the above experiment, only a six-digit encoding sequence was used to prove the feasibility.
5. Conclusions

In this paper, an OFC detection and recognition system based on VLC using a CNN is proposed. Different from the traditional encoding and decoding methods, this paper transforms the problem into an OFC recognition problem, which greatly simplifies the complexity of the VLC system. Frequency, phase, encoding sequence, and the distance between receiver and transmitter are considered to produce various OFCs. A CNN structure with FCN style is applied. Once the OFC is recognized correctly, the related information can be obtained.

The results show that the maximum recognizable distance and the recognition accuracy were both improved compared with the traditional 2D barcode recognition method. In particular, the proposed scheme can identify approximately 1.2 × 10^5 (50 × 2375 = 118,750) different OFCs with high accuracy (more than 95%), which is about 120 times that of Reference [3]. The effective recognition distance is 3.5 m, which is suitable for most scenes. Therefore, the proposed scheme can complement the traditional two-dimensional bar code, and can be applied to various scenes.

However, there still exist some inevitable limitations to our proposed method. Firstly, the performance of the scheme relies on the restrictive condition that the camera and the RGB-LEDs must be parallel to each other. Secondly, the recognition results will be imprecise when detecting high-speed objects. Thirdly, the cost of modifying RGB-LEDs is too high for production. Although all of the above disadvantages exist, they do not outweigh the innovation and improvement of this paper over previous methods. These drawbacks will also become our focus and direction in future work.
Author Contributions: Conceptualization, W.G.; Data curation, X.Z., Y.Y., J.Z., J.J.; Investigation, J.L.;
Methodology, W.G.; Project administration, J.L. and S.W.; Resources, X.Z.; Software, X.Z. and J.Z.; Supervision,
S.W.; Visualization, J.Z. and X.Z.; Writing—original draft, W.G. and J.L.; Writing—review & editing, W.G. and J.L.
Funding: This research was funded by National Undergraduate Innovative and Entrepreneurial Training
Program grant number 201510561003, 201610561065, 201610561068, 201710561006, 201710561054, 201710561057,
201710561058, 201710561199, 201710561202, 201810561217, 201810561195, 201810561218, 201810561219, Special
Funds for the Cultivation of Guangdong College Students’ Scientific and Technological Innovation (“Climbing
Program” Special Funds) grant number pdjh2017b0040, pdjha0028, pdjh2019b0037 and Guangdong science and
technology project grant number 2017B010114001.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Jin, F.; Li, X.; Zhang, R.; Dong, C.; Hanzo, L. Resource Allocation Under Delay-Guarantee Constraints for
Visible-Light Communication. IEEE Access 2017, 4, 7301–7312. [CrossRef]
2. Fang, J.; Yang, Z.; Long, S.; Wu, Z.; Zhao, X.; Liang, F.; Jinag, Z.L.; Chen, Z. High-speed indoor navigation
system based on visible light and mobile phone. IEEE Photonics J. 2017, 9, 1–11. [CrossRef]
3. Xie, C.; Guan, W.; Wu, X.; Fang, L.; Cai, Y. The LED-ID Detection and Recognition Method based on Visible
Light Positioning using Proximity Method. IEEE Photonics J. 2018, 10, 1–16. [CrossRef]
4. Zhang, B.; Ren, K.; Xing, G.; Fu, X.; Wang, C. SBVLC: Secure barcode-based visible light communication for
smartphones. IEEE Trans. Mob. Comput. 2016, 15, 432–446. [CrossRef]
5. Chen, H.W.; Wen, S.S.; Liu, Y.; Fu, M.; Weng, Z.C.; Zhang, M. Optical camera communication for mobile
payments using an LED panel light. Appl. Opt. 2018, 57, 5288–5294. [CrossRef] [PubMed]
6. Zhang, X.; Gao, Q.; Gong, C.; Xu, Z. User Grouping and Power Allocation for NOMA Visible Light
Communication Multi-Cell Networks. IEEE Commun. Lett. 2017, 21, 777–780. [CrossRef]
7. Islim, M.S.; Ferreira, R.X.; He, X.; Xie, E.; Videv, S.; Viola, S.; Watson, S.; Bamiedakis, N.; Penty, R.V.;
White, I.H.; et al. Towards 10 Gb/s orthogonal frequency division multiplexing-based visible light
communication using a GaN violet micro-LED. Photonics Res. 2017, 5, A35–A43. [CrossRef]
8. Lin, B.; Tang, X.; Yang, H.; Ghassemlooy, Z.; Zhang, S.; Li, Y.; Lin, C. Experimental Demonstration of IFDMA
for Uplink Visible Light Communication. IEEE Photonics Technol. Lett. 2016, 28, 2218–2220. [CrossRef]
9. Chow, C.W.; Chen, C.Y.; Chen, S.H. Enhancement of Signal Performance in LED Visible Light
Communications Using Mobile Phone Camera. IEEE Photonics J. 2015, 7, 1–17. [CrossRef]
10. Liang, K.; Chow, C.W.; Liu, Y. RGB visible light communication using mobile phone camera and multi-input
multi-output. Opt. Express 2016, 24, 9383–9388. [CrossRef] [PubMed]
11. Hu, P.; Pathak, P.H.; Feng, X.; Fu, H.; Mohapatra, P. ColorBars: Increasing data rate of LED-to-camera
communication using color shift keying. In Proceedings of the 11th ACM Conference on Emerging
Networking Experiments and Technologies, Heidelberg, Germany, 1–4 December 2015; pp. 1–13.
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).