A New Method For Visualizing and Evaluating Views in Architectural Design
Keywords: Window design tool; Fenestration; View visualization; Architecture metric; Parametric design workflow

Abstract: In buildings, good views to the exterior can provide positive psychological benefits and are often one of the most appreciated aspects of an interior space. However, the existing architectural design workflow fails to take view into consideration during the early design stages. This research provides two innovative outputs: first, a method to visualize and quantitatively evaluate real, photographic views from existing or prospective buildings and, second, a window design workflow to help architects evaluate view simultaneously along with the results of daylight and energy simulations. A "View Score" is calculated, in part, via a weighted average of view area and the user's view preferences. The method could also help real estate professionals to show or valuate views for prospective buyers of unbuilt units. The application of this workflow is tested in one case study in Cambridge, MA.
* Corresponding author.
E-mail addresses: [email protected] (W. Li), [email protected] (H. Samuelson).
https://doi.org/10.1016/j.dibe.2020.100005
Received 25 October 2019; Received in revised form 13 December 2019; Accepted 15 January 2020; Available online 30 January 2020
2666-1659/© 2020 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
W. Li, H. Samuelson, Developments in the Built Environment 1 (2020) 100005

1. Introduction

Window design is a crucial aspect of both architectural design and building performance optimization. Moreover, view is one of the most appreciated functions of a window in architectural design. Research has linked views, and in particular views of nature or other preferred views, with improved worker productivity (Leather et al., 1998) as well as positive psychological (Moore, 1981; Kaplan, 2001; Aries et al., 2010) and health (Verderber, 1986; Raanaas et al., 2012) benefits for building occupants. For example, a well-cited study in Science (Ulrich, 1984) correlated views of nature with reduced surgery recovery times. View also plays a significant role in the market price of real estate, since people are often willing to pay a premium for attractive views (Damigos and Anyfantis, 2011; Baranzini and Schaerer, 2011).

A few existing web tools can assist early-stage window design, such as the Façade Design Tool, the MIT Design Advisor, COMFEN, and PROSOLIS (Efficient Windows Collaborative, 2011; Glicksman et al., 2009; Berkeley Lab, n.d.; Dartevelle et al., 2014). Researchers have also developed frameworks to consider multiple criteria like daylight, view, and energy in the window design process (Reinhart and Wienold, 2011; Hellinga and Hordijk, 2014; Fontenelle and Ramalho, 2014). Yet the importance of view (i.e. the exterior context that occupants can see from indoors) to fenestration design has been overlooked in existing architectural design tools. At the time of writing, the authors could only find one available tool to help designers evaluate views in the window design process, the PROSOLIS web-based tool (Dartevelle et al., 2014). However, this tool only provides view-through images for different shading systems based on two sample view images, not the actual views. The visualization and evaluation of prospective views are not included in these existing tools.

Researchers have developed methods to visualize views through hand-drawn projection, a computer model, or a camera with a fisheye lens (Hellinga and Hordijk, 2014; Fontenelle and Ramalho, 2014). Recent research has developed frameworks to evaluate view access through vector raytracing methods in 3D models (Mardaljevic, 2019; Turan et al., 2019). But these methods require the user to model the surrounding buildings or visit a target site. Moreover, the resulting image is only as detailed as the drawn or modeled input, which often is rudimentary. Researchers have also developed view evaluation methods based on questionnaire studies (Hellinga and Hordijk, 2014; Fontenelle and Ramalho, 2014), but existing evaluation methods only evaluate views through photographs taken from already-built windows and therefore fail to help inform prospective window design options.

This research develops a framework to visualize and evaluate real views from existing or prospective window designs with less time and effort, as well as more realism. This research also proposes a workflow to visualize a large set of design options, which allows architects to see the performance trends of view, daylight, and energy in their designs and make more informed decisions. This research will ultimately help architects balance multiple factors holistically, in contrast to typical simulation, which tends to focus on a single factor at a time.

2. Methods

This paper proposes a new fenestration design workflow that can
accomplish three things. It can visualize views for different window de-
signs, quantitatively evaluate views according to users’ preferences, and
integrate view as an input in the current architectural design process
(including daylight and energy simulation, for example).
Rather than investing time and effort into building a detailed 3D site
model, the proposed method uses model and image data from a public
database to realistically simulate outdoor views. The following section
explains this method.
Fig. 1. View visualization method based on similar triangles.
Fig. 3. View visualization for window geometry.
FOVH2 = 2 tan⁻¹ [W / (L0 + 2L1)]    (4)

calculate the view range above and below the eye level:

β3 = tan⁻¹ [(he − h1) / L2]    (12)

When the eye level is between the windowsill and window head (examples V1, V2, and V3), the vertical field of view can then be calculated through Equation (13):
Fig. 4. Horizontal view range from different viewpoints in the plan view.
Fig. 6. Vertical view range from different viewpoints in the section view.
When the eye level is below the windowsill, as in example V4, the vertical field of view can then be calculated through Equation (14):

FOVv4 = α4 − β4    (14)
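As an illustration of the similar-triangles geometry behind these equations, a minimal Python sketch is given below. The function names are illustrative, not from the authors' Grasshopper script; all angles are returned in degrees.

```python
import math

def horizontal_fov(window_width, room_length, shading_depth):
    """Equation (4): FOV_H = 2 * atan(W / (L0 + 2 * L1)), in degrees.
    window_width (W), room_length (L0), and the vertical shading
    depth (L1) are all in meters."""
    return math.degrees(2 * math.atan(window_width / (room_length + 2 * shading_depth)))

def angle_below_eye(eye_height, sill_height, distance):
    """Equation (12): beta_3 = atan((h_e - h_1) / L_2), in degrees.
    eye_height (h_e) and sill_height (h_1) are measured from the floor;
    distance (L_2) is the horizontal distance to the window, in meters."""
    return math.degrees(math.atan((eye_height - sill_height) / distance))
```

For example, a 2 m wide, unshaded window seen from the back of a 4 m deep room subtends `horizontal_fov(2.0, 4.0, 0.0)`, about 53.1°.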
After requesting the source image and calculating the view ranges, the final step for view visualization is to project the view ranges back onto the source image. The authors propose using a 360° unobstructed panoramic image of prospective views as the base for view visualization. In Fig. 7, the authors generate a panoramic image from six images exported from Google Earth Studio, each with a FOV of 60°, an angle chosen due to its similarity with human vision (Spector, 1990; Dahnoun, 2017). The panoramic image generated by this method is accurate for new construction, where no building exists at the target viewpoint. If there is an existing building at the target viewpoint, the viewpoint can only be positioned at the perimeter edge or the top of the existing building to avoid obstruction from the building itself.

The next step is overlaying the window geometries onto the panoramic image. Since the FOV used to generate the panoramic image is 60°, Equation (15) can be used to calculate the real size of the panoramic image (Fig. 8):
Fig. 9. Shoebox model for the view range script test.
Wp = (W × FOVe) / FOV    (15)

where:
Wp – width of the exported image (m);
FOV – horizontal field of view (°);
FOVe – FOV used to export the image (°);
W – window width (m).

The authors embedded the proposed calculation methods into a Grasshopper script. Grasshopper (McNeel, 2019a; 2019b) is a computer scripting plug-in for the architectural 3D modeling software Rhinoceros. The inputs for the script include window orientation, room length (L0), window width (W), window height (H), windowsill height (H1), eye level height (He), vertical shading depth (L1), and horizontal shading depth (L2). The outputs are rectangles superimposed on the panoramic image that indicate the horizontal and vertical view range.

As a case study, the authors designed a shoebox model facing east (Fig. 9), located at the southeast corner of Central Park in New York (latitude 40.766°, longitude −73.969°, altitude 100 m). The windowsill height is 1.0 m and the window height is 1.4 m. The average human eye height is 1.567 m for a standing position and 1.171 m for a sitting position (NASA, 1995). These heights are with respect to the floor.

The authors then considered the case of an upper story by adding the floor altitude (height above sea level) to the eye level height. In this case, the altitude of the source images exported from Google Earth Studio is 101 m, which is the altitude of the sitting viewpoint. The authors then exported four images with a FOV of 90° at this location/elevation to generate the background panoramic view image, using the same method as in Fig. 7.

Using the proposed view visualization Grasshopper script, the user can now visualize the view range in Rhino. Fig. 10 illustrates the results, including four view ranges: the minimal view range, seen from the viewpoint at the back of the room (H1); the medial view range, seen from the viewpoint in the middle of the room (H2); the front view range, seen from the viewpoint at the front of the room (H3) (all aligned with the window midpoint); and the maximal view range (Hmax), which is an amalgamation of the two viewpoints at the horizontal window edges.

Fig. 10. View range results for the shoebox model.
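To make the projection step concrete, Equation (15) can be sketched as a one-line scaling function. The function name is illustrative, not from the authors' script.

```python
def exported_image_width(window_width, window_fov, export_fov):
    """Equation (15): Wp = (W * FOVe) / FOV.
    Scales the window width W (m) by the ratio of the export FOV (FOVe,
    degrees) to the horizontal FOV subtended by the window (degrees),
    giving the real width Wp of the exported panoramic image."""
    return window_width * export_fov / window_fov
```

For instance, a 2 m wide window subtending a 30° FOV, exported with a 60° FOV per image, yields an exported image width of 4 m.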
Fig. 11. User-defined view preference areas. (Image shows results for square
windows facing four possible orientations, from the mid-room viewpoint).
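The abstract describes the View Score as, in part, a weighted average of view area and the user's view preferences over areas like those in Fig. 11. One way such a score might be computed is sketched below; the region names, dictionary interface, and 0–100 scaling are assumptions for illustration, not the authors' exact formula.

```python
def view_score(region_areas, preference_weights):
    """Hypothetical area-weighted average of user preference weights.
    region_areas: dict mapping a view region name (e.g. 'sky', 'water')
        to its visible area within the view range.
    preference_weights: dict mapping region names to a user-defined
        preference in [0, 1]. Unlisted regions default to 0.
    Returns a score on a 0-100 scale."""
    total_area = sum(region_areas.values())
    if total_area == 0:
        return 0.0
    weighted = sum(area * preference_weights.get(name, 0.0)
                   for name, area in region_areas.items())
    return 100.0 * weighted / total_area
```

With equal sky and building areas and weights of 1.0 and 0.5, the sketch returns a score of 75.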
3. Case studies

The proposed workflow was tested in two case studies: one office window to validate the proposed view visualization method, and a parametric shoebox model to test the integrated window design workflow, which includes three performance metrics: view, daylight, and energy.
To validate the proposed view visualization method, the authors compared photographs taken through an office window to the output from the proposed view visualization tool. The authors first measured the dimensions of the office and the altitude of the test floor. Fig. 14 illustrates the dimensions of the office. The authors then input this information, along with the coordinates of the office (−71.114°, 42.376°) and the altitude of the floor above sea level (40.8 m), into the view visualization tool. The proposed tool then generates the view range at the same location. The authors then took a picture parallel to the window from a sitting position (eye level height: 1.2 m) and compared it to the output from the proposed tool.

Fig. 15. Validation result of the view visualization method.

Table 1. Time spent using the proposed view visualization method.

Step  Description                                        Recorded time
1     Getting coordinates and viewpoint height           3 min
2     Exporting source images from Google Earth Studio   10 min
3     Using Grasshopper script to visualize view         12 min
      Total time                                         25 min
Fig. 15 shows the output from the proposed view visualization tool and the photograph taken at the same place. The main building in the view matches the photograph well, and the tree and sky in the photograph are also included in the view range output. Hence this case study helps validate the proposed view visualization method.
The authors compared the time used for the proposed method and an example traditional modeling method to visualize the view from the same office. Table 1 lists the time spent on each step of the proposed view visualization method; the total time is 25 minutes.
The authors then recorded the time spent generating a comparable view image through 3ds Max, a modeling and rendering software (Autodesk, n.d.). The model was built by an expert 3D renderer with an architecture degree and more than two years' experience with the software. Fig. 16 illustrates the view visualization result through 3ds Max. Table 2 lists the description and time spent on each step using the modeling and rendering method for view visualization; the total time is 210 minutes.

Fig. 16. View visualization result through 3D modeling.

Table 2. Time record for the modeling and rendering method.

Step  Description                                                Recorded time
1     Modeling the building context, adding a sky image,         90 min
      and adjusting basic render settings
2     Modeling facade details of neighboring buildings,          120 min
      adding trees, and rendering
      Total time                                                 210 min

According to this example case, the proposed method used only 12% of the time needed for the more traditional modeling and rendering method. The time difference might be even larger if the site is located in a high-density neighborhood, in which case the modeling time would grow while the time needed for the proposed method would stay the same, since no modeling is required. This is only one example, and task times for the traditional method could vary greatly depending on the user, their workflow, and the urban context. Nevertheless, the authors believe that the proposed method is much more efficient than the existing architectural design methods of estimating view in most new buildings.
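The 12% figure follows directly from the two recorded totals:

```python
proposed_total_min = 25      # Table 1: total time, proposed method
traditional_total_min = 210  # Table 2: total time, modeling and rendering
ratio = proposed_total_min / traditional_total_min
print(f"{ratio:.0%}")  # prints "12%"
```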
south-facing scenarios.

The authors then calculated the view preference score according to the proposed methods. Fig. 19 displays the view preference scores for the 10 south-facing scenarios. The authors also conducted view access analysis, daylight analysis, and energy simulations for all of the 160 design scenarios for the final visualization. The daylight and energy simulation results in this paper were produced by existing tools, specifically Ladybug and Honeybee (an interface for the EnergyPlus engine), respectively (Ladybug Tools, 2018; Department of Energy, National Renewable Energy Laboratory, 2018).

After all the simulations were run, the authors uploaded the results to the Design Explorer interface. To better visualize the overall performance, the authors also generated a spider diagram for each design scenario.

Fig. 20 illustrates a process in which the user narrowed the 160 design scenarios down to the two options that meet both a view preference score of 70 and a maximum EUI target of 155 kWh/m2. Fig. 21 displays the two design options.

4. Discussion

This research has established a new method for view visualization, proposed a more quantifiable metric, the View Score, to evaluate view quality, and developed a window design workflow to include view in the existing window design process. The following sections demonstrate the main innovations and limitations of this research.

4.1. Contributions

View is an important attribute of window design but is currently difficult to consider in the fast-paced early architectural design process. The proposed view visualization method can help architects include view in the building design process with more detailed information and less modeling effort. This research started with an innovative way to visualize prospective views through Google Earth Studio, a software still in the beta phase. The authors proposed a method to match the field of view from a target viewpoint to the camera settings in Google Earth Studio. Using this method, users can visualize prospective views at any location where a three-dimensional model exists in Google's database.

Compared with existing view visualization methods that require the user to model the outdoor context in 3D, the proposed view visualization method, using Google's existing model database, is much more efficient. Moreover, modelers often find it challenging to include detailed context information in a site model, hence most view images exported from a site model are very rudimentary. The proposed view visualization method, by contrast, includes full context information, including trees, water, sky, and other elements from Google's database, allowing for more realistic-looking results.

Secondly, this research proposed a new metric, the "View Score", to evaluate view quality, using areas and weighting factors to quantify user-defined desirable views on a 360° panoramic image of prospective views. One possible application of this new view evaluation method is in building design, where an architect can use the View Score to decide the window location and geometry in order to obtain a better view.

For the proposed View Score, the user can further calculate the view scores from multiple occupant viewpoints (e.g. using the five-viewpoint method described above or a grid of nodes). One could even visualize the View Score spatially with a false-color plan overlay, similar to many daylighting metrics, such as "daylight autonomy", and then calculate an aggregate score, similar to "spatial daylight autonomy" for the room (Reinhart and Wienold, 2011).

Quantifying the View Score may not be necessary if a designer only wishes to compare a few options. However, this automated step would allow users to easily compare numerous design options and to combine these results with other outputs, such as daylighting and energy simulation results.

Finally, the proposed workflow makes it possible to consider view as an important factor in overall window performance. The quantitative View Score allows this metric to be evaluated along with other quantitative results such as simulated daylighting and energy metrics. The proposed new
workflow helps the user find desirable window design scenarios among numerous options in a pre-simulated pool. Importantly, in the authors' opinion, the value of the pool of numerous options is not in finding the one optimal solution, but rather in allowing the designer to observe trends in the results. For example, a visual ranking of multiple results may show that for window orientation options within 15 degrees of south, the orientation has a more dramatic impact on view than energy performance, or vice versa. By allowing the designer to observe trends like these, they can use the information, combined with all their other design criteria not included in the simulations, to make more informed design decisions.

Beyond building design, the view visualization method could also provide value to real estate brokers for showing prospective views to potential buyers. Meanwhile, generating view quality scores for different units within a building could help brokers adjust the price of each unit based on the quality of its views.

Fig. 20. Ranking of all design options that meet the EUI and view preference targets.
Fig. 21. Design options that meet both EUI and view preference targets.

4.2. Limitations and future work

Future goals for the proposed workflow are described below. They include obtaining full access to existing software, building a more dynamic view visualization method, and developing a more integrated platform.

The source image used in this paper is from Google Earth Studio, a tool still in the beta phase and not yet released to the public. Although the authors performed a validation of the credibility of the source image, future work could consider a more detailed validation. Moreover, since Google is still building its database, some cities might not have full 3D models yet, which restricts the use of the proposed view visualization method.

For the view visualization method, the authors used different viewpoints to generate view ranges in a room. This method is based on the average and extreme cases. Future work could develop a viewpoint grid and extend the proposed method to dynamic view visualization, so the user can visualize the view from a number of points in the room and from a variety of angles. In this way, the view visualization process can be more dynamic, and the View Score could be calculated and aggregated for the whole room. The described method does not account for interior objects that block a portion of the window. However, users can model their interior geometries and add these objects to the visualization in the same way that an architect would model a building's interior today; put simply, the interior portion of the architectural modeling process remains unchanged from traditional methods.

The proposed window design workflow could be further combined into one integrated platform, using one interface instead of multiple. An online platform could even encourage users to upload their results, eventually providing data for designers and researchers. The spider diagram is easy to read on the online interface and can be collected to build a result profile for different design scenarios.

The view image generated through the proposed method is a virtually constructed view based on a still person's visual field. However, the real perception of view might differ, since humans have a larger peripheral visual field laterally and downward (Spector, 1990). The movement of a person will also influence the perception of view.

A person's perception of view can also differ at different times. In theory, Google Earth Studio also provides time-of-day, date, and cloud attributes, so the user could export a photo at a specific time (Fig. 22). The time-of-day and date can help the user visualize sun locations. However, in our tests, the images did not change much when adding cloud attributes or changing the dates to a different season. This might be because Google's airplane only took photos on clear days in the spring season (Dennis, 2017). Future work could integrate new photo-editing algorithms to change the weather and seasons (Laffont et al., 2014).

4.3. Outlook
holistically consider trade-offs of multiple factors simultaneously, including the important consideration of view. Since designers have multiple goals to achieve in the process of window design, this workflow makes it easier to see the trends in window performance. The ultimate goal of such research is to use simulation tools with both quantitative and qualitative factors to augment human design.

5. Conclusion

This paper proposes a multi-factor window design workflow to help architects visualize view, daylight, and energy consumption simultaneously, while also providing a method to visualize and evaluate prospective views. Compared with existing view visualization methods, the proposed method not only requires less preparation time but also generates more accurate and detailed results. A window design workflow based on Rhino and Grasshopper was also proposed, which enables an interactive comparison of numerous design options. Ultimately, this research provides a tool that can help architects make more informed window design decisions.

Author contributions

Wenting Li: conceptualization, methodology, software, data curation, writing - original draft preparation, writing - review and editing. Holly W. Samuelson: conceptualization, methodology, writing - review and editing, funding acquisition, supervision.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

This research was partially funded by the Harvard Graduate School of Design. The authors would like to thank Ilia Yazdanpanah for his help with the literature review on the human benefits of views and with the modeling of the case study for the workflow comparison. The authors are not affiliated with any software provider.

References

Apple, 2017. Cameras. iOS Device Compatibility Reference. https://developer.apple.com/library/archive/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/Cameras/Cameras.html#//apple_ref/doc/uid/TP40013599-CH107-SW1.
Aries, Myriam B.C., Veitch, Jennifer A., Newsham, Guy R., 2010. Windows, view, and office characteristics predict physical and psychological discomfort. J. Environ. Psychol. 30 (4), 533–541. https://doi.org/10.1016/j.jenvp.2009.12.004.
Autodesk, n.d. 3ds Max. Accessed December 9, 2019. https://www.autodesk.com/products/3ds-max/overview.
Baranzini, Andrea, Schaerer, Caroline, 2011. A sight for sore eyes: assessing the value of view and land use in the housing market. J. Hous. Econ. 20 (3), 191–199. https://doi.org/10.1016/j.jhe.2011.06.001.
Berkeley Lab, n.d. COMFEN. Accessed December 13, 2019. https://windows.lbl.gov/software/comfen.
Dahnoun, Naim, 2017. Stereo vision implementation. In: Multicore DSP, pp. 604–616. John Wiley & Sons, Chichester, UK.
Damigos, Dimitris, Anyfantis, Fotis, 2011. The value of view through the eyes of real estate experts: a fuzzy Delphi approach. Landsc. Urban Plann. 101 (2), 171–178. https://doi.org/10.1016/j.landurbplan.2011.02.009.
Dartevelle, Olivier, Deneyer, Arnaud, Bodart, Magali, 2014. PROSOLIS: a web tool for thermal and daylight characteristics comparison of glazing complexes. https://doi.org/10.13140/2.1.1925.1522.
Dennis, Natalie, 2017. Google Earth's incredible 3D imagery, explained. April 18, 2017. https://blog.google/products/earth/google-earths-incredible-3d-imagery-explained/.
Department of Energy, National Renewable Energy Laboratory, Construction Engineering Laboratory, University of Illinois, 2018. EnergyPlus. https://energyplus.net/.
Efficient Windows Collaborative, 2011. Facade Design Tool. Windows for High-Performance Commercial Buildings. https://www.commercialwindows.org/fdt.php.
Fontenelle, Marília Ramalho, Bastos, Leopoldo Eurico Gonçalves, 2014. The multicriteria approach in the architecture conception: defining windows for an office building in Rio de Janeiro. Build. Environ. 74 (April), 96–105. https://doi.org/10.1016/j.buildenv.2014.01.005.
Glicksman, Leon, Urban, Bryan, Ali, Zehra, Lehar, Matthew, Gouldstone, James, Ray, Steve, 2009. The MIT Design Advisor. http://designadvisor.mit.edu/design/.
Google Inc., n.d. Google Maps. Accessed December 7, 2019. https://www.google.com/maps/@40.7435462,-73.9712484,15z.
Google LLC, 2019. Google Earth Studio. January 21, 2019. https://www.google.com/earth/studio/.
Google LLC, n.d. Google Earth. Accessed December 8, 2019. https://earth.google.com/web/.
Hellinga, Hester, Hordijk, Truus, 2014. The D&V analysis method: a method for the analysis of daylight access and view quality. Build. Environ. 79 (September), 101–114. https://doi.org/10.1016/j.buildenv.2014.04.032.
Kaplan, Rachel, 2001. The nature of the view from home: psychological benefits. Environ. Behav. 33 (4), 507–542. https://doi.org/10.1177/00139160121973115.
Ladybug Tools, 2018. Ladybug. https://www.ladybug.tools/about.html.
Laffont, Pierre-Yves, Ren, Zhile, Tao, Xiaofeng, Qian, Chao, Hays, James, 2014. Transient attributes for high-level understanding and editing of outdoor scenes. ACM Trans. Graph. 33 (4), 1–11. https://doi.org/10.1145/2601097.2601101.
Leather, Phil, Pyrgas, Mike, Beale, Di, Lawrence, Claire, 1998. Windows in the workplace: sunlight, view, and occupational stress. Environ. Behav. 30 (6), 739–762.
Liang, Jianming, Gong, Jianhua, Li, Wenhang, 2018. Applications and impacts of Google Earth: a decadal review (2006–2016). ISPRS J. Photogrammetry Remote Sens. 146 (December), 91–107. https://doi.org/10.1016/j.isprsjprs.2018.08.019.
Liu, Lingyun, Stamos, Ioannis, 2012. A systematic approach for 2D-image to 3D-range registration in urban environments. Comput. Vis. Image Understand. 116 (1), 25–37.
Mardaljevic, John, 2019. Aperture-based daylight modelling: introducing the 'View Lumen'. In: Proceedings of the 16th IBPSA International Conference and Exhibition.
McNeel, 2019a. Grasshopper. September 23, 2019. https://www.rhino3d.com/6/new/grasshopper.
McNeel, 2019b. Rhino (version 6). September 23, 2019. https://www.rhino3d.com/.
Moore, Ernest O., 1981. A prison environment's effect on health care service demands. J. Environ. Syst. 11 (1), 17–34. https://doi.org/10.2190/KM50-WH2K-K2D1-DM69.
NASA, 1995. Anthropometry and biomechanics. Man-Systems Integration Standards, July 1995. https://msis.jsc.nasa.gov/sections/section03.htm.
Raanaas, Ruth Kjaersti, Patil, Grete Grindal, Hartig, Terry, 2012. Health benefits of a view of nature through the window: a quasi-experimental study of patients in a residential rehabilitation center. Clin. Rehabil. 26 (1), 21–32. https://doi.org/10.1177/0269215511412800.
Reinhart, Christoph, Wienold, Jan, 2011. The daylighting dashboard - a simulation-based design analysis for daylit spaces. Build. Environ. 46 (2), 386–396. https://doi.org/10.1016/j.buildenv.2010.08.001.
Solemma, 2018. DIVA-for-Rhino (version 4). https://www.solemma.com/Diva.html.
Spector, Robert, 1990. Visual fields. In: Clinical Methods: The History, Physical, and Laboratory Examinations, third ed. Butterworths, Boston. https://www.ncbi.nlm.nih.gov/books/NBK220/.
Thornton Tomasetti, 2019. Design Explorer. http://core.thorntontomasetti.com/design-explorer/.
Turan, Irmak, Reinhart, Christoph, Kocher, Michael, 2019. Evaluating spatially-distributed views in open plan work spaces. In: Proceedings of the 16th IBPSA International Conference and Exhibition.
Ulrich, Roger S., 1984. View through a window may influence recovery from surgery. Science 224, 420–421.
Verderber, Stephen, 1986. Dimensions of person-window transactions in the hospital environment. Environ. Behav. 18 (4), 450–466.