
Developments in the Built Environment 1 (2020) 100005

Contents lists available at ScienceDirect

Developments in the Built Environment


journal homepage: www.editorialmanager.com/dibe/default.aspx

A new method for visualizing and evaluating views in architectural design


Wenting Li *, Holly Samuelson
Harvard University, Graduate School of Design, 48 Quincy St, Cambridge, MA, 02138, USA

ARTICLE INFO

Keywords: Window design tool; Fenestration; View visualization; Architecture metric; Parametric design workflow

ABSTRACT

In buildings, good views to the exterior can provide positive psychological benefits and are often one of the most appreciated aspects of an interior space. However, the existing architectural design workflow fails to take view into consideration during the early design stages. This research provides two innovative outputs: first, a method to visualize and quantitatively evaluate real, photographic views from existing or prospective buildings and, second, a window design workflow to help architects evaluate view simultaneously along with the results of daylight and energy simulations. A "View Score" is calculated, in part, via a weighted average of view area and the user's view preferences. The method could also help real estate professionals to show or valuate views for prospective buyers of unbuilt units. The application of this workflow is tested in one case study in Cambridge, MA.

1. Introduction

Window design is a crucial aspect of both architectural design and building performance optimization. Moreover, view is one of the most appreciated functions of a window in architectural design. Research has linked views, and in particular, views of nature or other preferred views, with improved worker productivity (Leather et al., 1998) as well as positive psychological (Moore, 1981; Kaplan, 2001; Aries et al., 2010) and health (Verderber, 1986; Raanaas et al., 2012) benefits for building occupants. For example, a well-cited study in Science (Ulrich, 1984) correlated views of nature with reduced surgery recovery times. View also plays a significant role in the market price of real estate, since people are often willing to pay a premium for attractive views (Damigos and Anyfantis, 2011; Baranzini and Schaerer, 2011).

A few existing web tools can assist early-stage window design, such as the Façade Design Tool, the MIT Design Advisor, COMFEN, and PROSOLIS (Efficient Windows Collaborative, 2011; Glicksman et al., 2009; Dartevelle et al., 2014; Berkeley Lab, n.d.). Researchers have also developed frameworks to consider multiple criteria like daylight, view, and energy in the window design process (Reinhart and Jan, 2011; Hellinga and Hordijk, 2014; Fontenelle and Ramalho, 2014). Yet, the importance of view (i.e. the exterior context that occupants can see from indoors) to fenestration design has been overlooked in existing architectural design tools. At the time of writing, the authors could only find one available tool to help designers evaluate views in the window design process — the PROSOLIS web-based tool (Dartevelle et al., 2014). However, this tool only provides view-through images for different shading systems based on two sample view images, not the actual views. The visualization and evaluation of prospective views are not included in these existing tools.

Researchers have developed methods to visualize views through hand-drawn projection, a computer model, or a camera with a fisheye lens (Hellinga and Hordijk, 2014; Fontenelle and Ramalho, 2014). Recent research has developed frameworks to evaluate view access through vector raytracing methods in 3D models (Mardaljevic, 2019; Turan et al., 2019). But these methods require the user to model the surrounding buildings or visit a target site. Moreover, the resulting image is only as detailed as the drawn or modeled input, which often is rudimentary. Researchers have developed view evaluation methods based on questionnaire studies (Hellinga and Hordijk, 2014; Fontenelle and Ramalho, 2014), but existing evaluation methods only evaluate views through photographs taken from already-built windows and therefore fail to help inform prospective window design options.

This research develops a framework to visualize and evaluate real views from existing or prospective window designs with less time and effort, as well as more realism. This research also proposes a workflow to visualize a large set of design options, which allows architects to see the performance trends of view, daylight, and energy in their designs to make more informed decisions. This research will ultimately help architects to balance multiple factors holistically, in contrast to typical simulation, which tends to focus on a single factor at a time.

2. Methods

This paper proposes a new fenestration design workflow that can

* Corresponding author.
E-mail addresses: [email protected] (W. Li), [email protected] (H. Samuelson).

https://doi.org/10.1016/j.dibe.2020.100005
Received 25 October 2019; Received in revised form 13 December 2019; Accepted 15 January 2020
Available online 30 January 2020
2666-1659/© 2020 The Author(s). Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
W. Li, H. Samuelson Developments in the Built Environment 1 (2020) 100005

accomplish three things. It can visualize views for different window designs, quantitatively evaluate views according to users' preferences, and integrate view as an input in the current architectural design process (including daylight and energy simulation, for example).

Rather than investing time and effort into building a detailed 3D site model, the proposed method uses model and image data from a public database to realistically simulate outdoor views. The following section explains this method.

2.1. View visualization

The workflow proposed here consists of three main steps to visualize views. The first step is obtaining the source image from the model database. The second step is calculating the view range of different window designs. The third step is projecting the view ranges back to the source image.

Here, the view-visualization method uses Google Earth Studio, a relatively new tool based on Google Earth (Google LLC, 2019). Google Earth is a detailed 3D virtual globe built from satellite images and aerial photos (Google LLC, n.d.); it was first released in 2005 and has been used as a virtual globe tool in both academia and industry (Liang et al., 2018). Google takes photos via airplane and uses a GPS antenna to track the location and angle of the camera. The 3D models were built from these photos through computer vision algorithms (Dennis, 2017; Liu and Stamos, 2012). At the time of this writing, Google has updated 3D imagery for more than 500 cities in the United States and more than 1,000 internationally (Google LLC, n.d.). Google Earth Studio is currently in beta testing and was made available to the authors, who have no affiliation with Google, through request. However, it will become publicly available. The main benefit of this tool is that it allows for customized camera settings, such as camera location, camera altitude, camera tilt, and camera lens, as well as a specific time of day and year. These settings make it possible for users to set the camera inside a proposed building and obtain the source image immediately through Google Earth Studio.

To obtain the right source image from Google Earth Studio, the authors proposed a method based on similar triangles, shown in Fig. 1. Field of view (FOV) is the factor that constrains the view range. The authors used the same FOV to ensure the image exported from Google Earth Studio represents the same view range from the test viewpoint.

Fig. 1. View visualization method based on similar triangles.

The authors used photographs taken from the authors' student dormitory at the time and the authors' current office building to evaluate the results of Google Earth Studio. Three inputs need to be acquired to export images from Google Earth Studio: coordinates of the space, camera altitude, and FOV. The authors used the coordinates of the test spaces from Google Maps (Google Inc., n.d.). The camera altitude was calculated as the sum of the site altitude, the floor height of the test space, and the camera height above the floor. The authors used the back camera of an iPhone 8 to take photographs at the perimeter edge of the buildings; the camera was set parallel to the windows. The camera the authors used has an average field of view of 60° (Apple, 2017).

Fig. 2 is the comparison of the photographs taken in the test spaces (1A, 2A, 3A) and images exported from Google Earth Studio (1B, 2B, 3B). Based on the comparison shown here, as well as similar tests, the authors consider the view visualization method through Google Earth Studio accurate enough for their intended purposes, and far superior to existing architectural design methods of estimating view.

Fig. 2. Comparison of Google Earth Studio results and on-site photographs.

After obtaining the source images, the second step is calculating the view range (i.e. the extent of the image that can be seen) through the window. This requires a method to convert window geometries into a FOV input for Google Earth Studio. The following equations can be used to calculate the FOV for the viewpoint at the center of the room:

tan α = W / L (1)

FOV = 2 tan⁻¹(W / L) (2)

where FOV – Horizontal field of view (°);
W – Window width (m);
L – Room length (m).

Fig. 3 shows the method to visualize the view through different windows. After exporting the view image with the correct FOV from Google Earth Studio, the view-through image for different window geometries can be generated by overlapping the elevations of different window designs with the unobstructed view image and then subtracting the opaque elements, e.g. walls, window frames, etc., from the image.

Fig. 3. View visualization for window geometry.

In order to determine the FOV, the user must consider the window design and the occupant's viewpoint. The scenes from different viewpoints are different, and wall or window frame thicknesses, as well as sun shading structures, can obstruct the views. The following section illustrates methods to visualize views from multiple viewpoints and to


consider the effect of obstructions, such as louvers or fins.
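The FOV calculation for a centered viewpoint (Equations (1) and (2)), together with the camera-altitude sum described above, can be sketched in a few lines of Python. The function names and the example dimensions are ours, for illustration only, not values from the paper's case studies:

```python
import math

def camera_altitude(site_altitude_m, floor_height_m, camera_height_m):
    """Camera altitude above sea level: the sum of site altitude, floor
    height of the test space, and camera height above the floor."""
    return site_altitude_m + floor_height_m + camera_height_m

def horizontal_fov_center(window_width_m, room_length_m):
    """Equation (2): FOV = 2 * tan^-1(W / L) for a viewpoint at the
    center of the room, returned in degrees."""
    return math.degrees(2 * math.atan(window_width_m / room_length_m))

# Hypothetical example: a 3 m wide window seen from the center of a 4.5 m deep room.
print(round(horizontal_fov_center(3.0, 4.5), 1))   # → 67.4
# Hypothetical site at 10 m, third floor at 3 m, seated eye height 1.171 m.
print(round(camera_altitude(10.0, 3.0, 1.171), 3)) # → 14.171
```

The resulting FOV in degrees can be entered directly as the camera lens setting when exporting a source image.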


An occupant can occupy many different locations in the room, leading to infinite different view perspectives, and one could visualize and evaluate any number of them. For example, a user could calculate the views at known seating locations, or apply a horizontal grid of nodes, e.g. 1 m by 1 m, similar to the method used today in daylight simulation (Reinhart and Jan, 2011). Here, to summarize the horizontal range of views, the authors propose a simple five-point calculation method. They use three viewpoints in the midline of a window to calculate the average view range, and two points at the horizontal edges of a window to calculate the maximal view range. Fig. 4 illustrates the different view ranges for the five viewpoints in a room plan. Scenario Hmax represents the maximal horizontal view range, i.e. the extents that could be seen by an occupant traveling from one edge of the window to the other. Equations (3)–(6) can be used to calculate the FOV of scenarios H1, H2, H3, and Hmax in Fig. 4, respectively:

FOVH1 = 2 tan⁻¹[W / (2(L0 + L1))] (3)

FOVH2 = 2 tan⁻¹[W / (L0 + 2L1)] (4)

FOVH3 = 2 tan⁻¹[W / (2L1)] (5)

FOVHmax = 2 tan⁻¹(W / L1) (6)

where FOVH – Horizontal field of view (°);
W – Window width (m);
L0 – Room length (m);
L1 – Vertical shading depth (m).

The room length here is the distance between the window and the wall opposite to it. This five-point calculation method is valid if the three viewpoints are in the midline of the window and the two viewpoints are at the horizontal edges of the window. The window position is flexible and is not constrained to be set at the middle of the wall (Fig. 5).

Fig. 4. Horizontal view range from different viewpoints in the plan view.

Fig. 5. Flexible window position for the new method.

The authors propose a similar method, demonstrated in Fig. 6, to calculate the vertical view range. This calculation method is also valid for flexible window positions. For the examples V1 and V4, where the viewpoint is at the back of the room, Equations (7) and (8) can be used to calculate the view range above and below the eye level:

α1, α4 = tan⁻¹[(H + h1 − he) / (L0 + L2)] (7)

β1, β4 = tan⁻¹[|h1 − he| / (L0 + L2)] (8)

For the viewpoint in the middle of the room, example V2, Equations (9) and (10) can be used to calculate the vertical view angles:

α2 = tan⁻¹[2(H + h1 − he) / (L0 + 2L2)] (9)

β2 = tan⁻¹[2(he − h1) / (L0 + 2L2)] (10)

For the viewpoint at the front of the room, example V3, Equations (11) and (12) can be used to calculate the vertical view angles:

α3 = tan⁻¹[(H + h1 − he) / L2] (11)

β3 = tan⁻¹[(he − h1) / L2] (12)

Fig. 6. Vertical view range from different viewpoints in the section view.

When the eye level is between the windowsill and the window head (examples V1, V2, and V3), the vertical field of view can then be calculated through Equation (13):


FOVv1,v2,v3 = α1,2,3 + β1,2,3 (13)

When the eye level is below the windowsill, example V4, the vertical field of view can then be calculated through Equation (14):

FOVv4 = α4 − β4 (14)

where H – Window height (sill to head) (m);
FOVv – Vertical field of view (°);
h1 – Window sill height (m);
he – Eye level height (m);
L0 – Room length (m);
L2 – Horizontal shading depth (m).

Fig. 8. Scale panoramic image to the real size and overlap window geometry. (Image illustrates a different window on each of four walls.)
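A minimal Python sketch of the five-point horizontal calculation (Equations (3)–(6)) and the back-of-room vertical calculation (Equations (7), (8), (13), and (14)); the mid- and front-of-room vertical viewpoints follow the same pattern using Equations (9)–(12). The function names and the sample room dimensions are ours, for illustration:

```python
import math

def fov_horizontal(W, L0, L1):
    """Equations (3)-(6): horizontal FOVs (degrees) for the five-point method.
    W: window width (m); L0: room length (m); L1: vertical shading depth (m)."""
    deg = math.degrees
    return {
        "H1":   deg(2 * math.atan(W / (2 * (L0 + L1)))),  # back of room, Eq. (3)
        "H2":   deg(2 * math.atan(W / (L0 + 2 * L1))),    # middle of room, Eq. (4)
        "H3":   deg(2 * math.atan(W / (2 * L1))),         # front of room, Eq. (5)
        "Hmax": deg(2 * math.atan(W / L1)),               # window-edge points, Eq. (6)
    }

def fov_vertical_back(H, h1, he, L0, L2):
    """Vertical FOV (degrees) for a back-of-room viewpoint (V1 or V4).
    H: window height; h1: sill height; he: eye height; L0: room length;
    L2: horizontal shading depth (all in metres)."""
    alpha = math.atan((H + h1 - he) / (L0 + L2))  # angle to window head, Eq. (7)
    beta = math.atan(abs(h1 - he) / (L0 + L2))    # angle to window sill, Eq. (8)
    if he >= h1:                                  # eye between sill and head, Eq. (13)
        return math.degrees(alpha + beta)
    return math.degrees(alpha - beta)             # eye below the sill, Eq. (14)

# Hypothetical room: 3 m wide window, 4.5 m deep room, 0.5 m deep fins.
ranges = fov_horizontal(3.0, 4.5, 0.5)
print(round(ranges["H1"], 1), round(ranges["Hmax"], 1))  # → 33.4 161.1
```

As expected, the view range widens monotonically from the back-of-room viewpoint (H1) to the amalgamated edge viewpoints (Hmax).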

After requesting the source image and calculating the view ranges, the final step for view visualization is to project the view ranges back to the source image. The authors propose using a 360° unobstructed panoramic image of prospective views as the base for view visualization. In Fig. 7, the authors generate a panoramic image from six images exported from Google Earth Studio, each with a FOV of 60°, an angle chosen due to its similarity with human vision (Spector, 1990; Dahnoun, 2017). The panoramic image generated by this method is accurate for new construction, where no building exists at the target viewpoint. If there is an existing building at the target viewpoint, the viewpoint can only be positioned at the perimeter edge or the top of the existing building to avoid obstruction from the building itself.

Fig. 7. Generate 360° panoramic view image.

The next step is overlapping window geometries onto the panoramic image. Since the FOV used to generate the panoramic image is 60°, Equation (15) can be used to calculate the real size of the panoramic image (Fig. 8):

Wp = (W × FOVe) / FOV (15)

where Wp – Width of the exported image (m);
FOV – Horizontal field of view (°);
FOVe – FOV used to export the image (°);
W – Window width (m).

The authors embedded the proposed calculation methods into a Grasshopper script. Grasshopper (McNeel, 2019a; 2019b) is a computer scripting plug-in for the architectural 3D modeling software Rhinoceros. The inputs for the script include window orientation, room length (L0), window width (W), window height (H), windowsill height (h1), eye level height (he), vertical shading depth (L1), and horizontal shading depth (L2). The outputs are rectangles superimposed on the panoramic image that indicate the horizontal and vertical view range.

As a case study, the authors designed a shoebox model facing east (Fig. 9), located at the southeast corner of Central Park in New York (latitude 40.766°, longitude −73.969°, altitude 100 m). The window sill height is 1.0 m and the window height is 1.4 m. The average human eye height is 1.567 m for a standing position and 1.171 m for a sitting position (NASA, 1995). These heights are with respect to the floor.

Fig. 9. Shoebox model for the view range script test.

The authors then considered the case of an upper story by adding the floor altitude (height above sea level) to the eye level height. In this case, the altitude of the source images exported from Google Earth Studio is 101 m, which is the altitude of the sitting viewpoint. Then the authors exported four images with a FOV of 90° at this location/elevation to generate the background panoramic view image, using the same method as in Fig. 7.

Fig. 10. View range results for the shoebox model.

Using the proposed view visualization Grasshopper script, the user can now visualize the view range in Rhino. Fig. 10 illustrates the results, including four view ranges, which represent the minimal view range, which is from the viewpoint at the back of the room (H1); the medial view range, which is seen from the viewpoint in the middle of the room


(H2); the front view range, which is seen from the viewpoint at the front
of the room (H3) (all aligned with the window midpoint), and the
maximal view range (Hmax), which is an amalgamation of the two
viewpoints at the horizontal window edges.
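The scaling in Equation (15) and the tile count for the panorama assembly can be sketched as follows; the function name and the example window are ours, for illustration:

```python
def panorama_strip_width(window_width_m, fov_deg, export_fov_deg=60.0):
    """Equation (15): Wp = (W * FOVe) / FOV, the real-world width that one
    exported image strip should be scaled to before the window geometry is
    overlaid on the panoramic image."""
    return window_width_m * export_fov_deg / fov_deg

# Six 60-degree tiles cover the full 360-degree panorama.
tiles = int(360 / 60)

# Hypothetical window: 3 m wide, seen with a horizontal FOV of 60 degrees,
# so one exported tile maps exactly onto the window width.
print(tiles, panorama_strip_width(3.0, 60.0))  # → 6 3.0
```

A wider FOV from the same window shrinks the strip width proportionally, since the same window then occupies a larger share of the exported image.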

2.2. Evaluating view preference

The following sections illustrate a proposed method to quantify view preference (the potential benefits of quantifying a View Score are discussed in Section 2.3). This step in the proposed workflow allows users to input their preferences in the view evaluation process and then quantitatively compare different prospective design options related to window orientation, geometry, etc. The inputs for the view preference calculation include the background panoramic image, the window boundary, and user-selected areas for desirable views, undesirable views, and neutral views, as well as optional weighting factors for these areas, as described below.

The proposed view preference calculation method is based in a Rhino/Grasshopper environment, via a Grasshopper script the authors wrote. First, users need to draw window areas and their view preference areas as closed surfaces on top of the panoramic image in Rhino. Then they can select the areas and assign the surfaces as "window", "desirable views", and "undesirable views" in Grasshopper. The Grasshopper script will then calculate the areas of the input surfaces. Fig. 11 illustrates example user selections of desirable and undesirable view areas, outlined by green and red rectangles respectively.

The user can then define weighting factors to determine how desirable or undesirable the user evaluates each area within the prospective scene. For example, a view to vegetation or a famous landmark could be weighted to increase the score, while a view to unsightly rooftop equipment could reduce the score.

In the proposed workflow, a View Score can then be calculated for each window design option. The score is a factor of both the sub-area of glass and its weighting factor; all the sub-areas are then summed for the whole window. In the following sections, the authors use weighting factors of 100 for desirable views, −100 for undesirable views, and 50 for everything else. (The neutral view areas are given a positive weighting factor, since generally a larger view area is desirable.)

The authors wrote a Grasshopper script to quantify view preference scores for each window design option. Fig. 12 demonstrates the view preference scores, calculated via a Grasshopper script, for the four orientation options of a 3 m × 3 m facade with a 60% window-to-wall ratio; higher scores mean more desirable views.

Fig. 11. User-defined view preference areas. (Image shows results for square windows facing four possible orientations, from the mid-room viewpoint.)

Fig. 12. Results of view preference score calculation. (Image shows results for square windows facing four possible orientations, from the mid-room viewpoint.)

2.3. Integrating view in window design workflow

After proposing the new framework to visualize and evaluate views, the authors further propose a parametric window design workflow to integrate view, daylight, and energy into the window design process. The important part of a multi-criteria design workflow is the results visualization; in an ideal workflow the user runs multiple parametric simulations and can then view the results of multiple design options simultaneously.

There are already some existing web tools to help with the visualization part of parametric designs. One example is Design Explorer, an online interface that allows users to visualize and filter customized design solutions (Tomasetti, 2019). The authors customize a workflow through Design Explorer to enable architects to visualize all prospective design options and rank them based on the performance of view, daylight, or energy. Fig. 13 illustrates the proposed window design workflow. The user can set their own targets for different categories and filter all of the design scenarios to find the most desirable ones.

The first step in the proposed workflow is acquiring source images from Google Earth Studio. The second step is importing the images into Rhino and adjusting the size of the images according to the method in Fig. 8. The third step is using the new tool the authors developed to visualize and evaluate views of various window design options. Then one can use other tools such as DIVA, Ladybug/Honeybee, Climate Studio, or others to conduct daylighting and energy analysis (Solemma, 2018; Ladybug Tools, 2018). The analysis results can then be uploaded to Design Explorer, where users can filter their window design options based on the performance of view, daylight, and energy.

Fig. 13. Integrate view in parametric window design process.
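The View Score and the target-based filtering described above can be sketched as below. This is our interpretation: the weighted summing of glass sub-areas, the 100/−100/50 weights, and the targets of a view score of 70 and an EUI of 155 kWh/m² come from the paper, but the normalization by total glass area, the option names, and the example areas and EUI values are our assumptions for illustration:

```python
def view_score(sub_areas, weights=None):
    """Weighted View Score for one window design option.
    sub_areas maps a view category to the visible glass area (m^2) in that
    category. Weights follow the paper's example (100 desirable, -100
    undesirable, 50 neutral); dividing by total glass area to obtain a
    score on a -100..100 scale is our assumption, not stated in the paper."""
    if weights is None:
        weights = {"desirable": 100.0, "undesirable": -100.0, "neutral": 50.0}
    total_area = sum(sub_areas.values())
    if total_area == 0:
        return 0.0
    return sum(weights[cat] * area for cat, area in sub_areas.items()) / total_area

def filter_options(options, min_view_score=70.0, max_eui=155.0):
    """Design-Explorer-style filtering: keep options that meet a minimum
    view preference score and a maximum EUI (kWh/m^2) target."""
    return [o for o in options
            if o["view_score"] >= min_view_score and o["eui"] <= max_eui]

# Two hypothetical design options with invented areas and EUI values.
options = [
    {"name": "south_60wwr", "eui": 150.0,
     "view_score": view_score({"desirable": 1.8, "neutral": 0.6})},
    {"name": "north_40wwr", "eui": 140.0,
     "view_score": view_score({"desirable": 0.4, "undesirable": 1.0, "neutral": 1.0})},
]
print([o["name"] for o in filter_options(options)])  # → ['south_60wwr']
```

The second option passes the energy target but fails the view target, mirroring the narrowing-down process the case study performs in Design Explorer.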


3. Case study

The proposed workflow was tested in two case studies: one office
window to validate the proposed view visualization method; and a
parametric shoebox model to test the integrated window design work-
flow which includes three performance metrics—view, daylight, and
energy.

3.1. Validation of proposed view visualization method

To validate the proposed view visualization method, the authors compare photographs taken through an office window to the output from the proposed view visualization tool. The authors first measured the dimensions of the office and the altitude of the test floor. Fig. 14 illustrates the dimensions of the office. Then the authors input this information, along with the coordinates of the office (−71.114°, 42.376°) and the altitude of the floor above sea level (40.8 m), into the view visualization tool. The proposed tool will then generate the view range at the same location. The authors then took a picture parallel to the window from a sitting position (eye level height: 1.2 m) and compared it to the output from the proposed tool.

Fig. 14. Summary of the office used for validation.

Fig. 15 shows the output from the proposed view visualization tool and the photograph taken at the same place. The main building in the view matches well with the photograph; the tree and sky in the photograph are also included in the view range output. Hence this case study helps validate the proposed view visualization method.

Fig. 15. Validation result of view visualization method.

The authors compared the time used for the proposed method and an example traditional modeling method to visualize the view from the same office. Table 1 demonstrates the time spent on each step using the proposed method for view visualization; the total time is 25 minutes.

Table 1
Time spent using the proposed view visualization method.

Step  Description                                        Recorded time
1     Getting coordinates and viewpoint height           3 min
2     Exporting source images from Google Earth Studio   10 min
3     Using Grasshopper script to visualize view         12 min
      Total time                                         25 min

The authors then recorded the time spent generating a comparable view image through 3ds Max, a modeling and rendering software (Autodesk, n.d.). The model was built by an expert 3D renderer with an architecture degree and more than two years' experience with the software. Fig. 16 illustrates the view visualization result through 3ds Max. Table 2 lists the description and time spent on each step using the modeling and rendering method for view visualization; the total time is 210 minutes.

Fig. 16. View visualization result through 3D modeling.

Table 2
Time record for modeling and rendering method.

Step  Description                                                  Recorded time
1     Modeling the building context, adding a sky image, and       90 min
      adjusting basic render settings
2     Modeling facade details of neighboring building, adding      120 min
      trees and rendering
      Total time                                                   210 min

According to this example case, the proposed method only used 12% of the time needed for the more traditional modeling and rendering method. The time difference might be even larger if the site were located in a high-density neighborhood, in which case the modeling time would be longer while the time used for the proposed method would stay the same, since no modeling is required in the proposed method. This is only one example, and task times for the traditional method could vary greatly depending on the user, their workflow, and the urban context. Nevertheless, the authors believe that the proposed method is much more efficient than the existing architectural design methods of estimating view in most new buildings.

3.2. Case study for proposed window design workflow

To test the integrated window design workflow which includes view, daylight, and energy, the authors used a parametric model and simulated 160 window design options. The model, a proposed office located in Cambridge, MA, is 4.5 m × 3 m × 3 m. The authors used ten window geometry options, four window-to-wall ratios (80%, 60%, 40%, 20%), four window shapes (rectangular, square, vertical, circle), and four types of shadings (no shading, 0.5 m overhang, 0.5 m fins, 0.5 m overhang and fins). Fig. 17 illustrates the ten options for window geometry inputs and the four shading inputs.

The authors then input the site location and set the camera in Google Earth Studio. The viewpoint was set at the center of the model with a FOV of 67.6°. Fig. 18 displays the view visualization results for the north- and


south-facing scenarios.

Fig. 17. Window geometry and shading inputs.

The authors then calculated the view preference score according to the proposed methods. Fig. 19 displays the view preference scores for the 10 south-facing scenarios. The authors also conducted view access analysis, daylight analysis, and energy simulations for all of the 160 design scenarios for the final visualization. The daylight and energy simulation results in this paper are produced by existing tools, specifically, Ladybug and Honeybee (an interface for the EnergyPlus engine) respectively (Ladybug Tools, 2018; Department of Energy & National Renewable Energy Laboratory, 2018).

After all the simulations were run, the authors uploaded the results to the Design Explorer interface. To better visualize the overall performance, the authors also generated a spider diagram for each design scenario.

Fig. 20 illustrates a process in which the user narrowed down the 160 design scenarios to the two options that meet both a view preference score of 70 and a maximum EUI (energy use intensity) target of 155 kWh/m². Fig. 21 displays the two design options.

4. Discussion

This research has established a new method for view visualization, proposed a more quantifiable metric — the View Score — to evaluate view quality, and developed a window design workflow to include view in the existing window design process. The following sections demonstrate the main innovations and limitations of this research.

4.1. Contributions

View is an important attribute of window design but is currently difficult to consider in the fast-paced early architectural design process. The proposed view visualization method can help architects include view in the building design process with more detailed information and less modeling effort. This research started with an innovative way to visualize prospective views through Google Earth Studio — a software still in the beta phase. The authors proposed a method to match the field of view from a target viewpoint to the camera setting in Google Earth Studio. Using this method, users can visualize prospective views at any location where a three-dimensional model exists in Google's database.

Compared with existing view visualization methods that require the user to model the outdoor context in 3D, the proposed view visualization method, using Google's existing model database, is much more efficient. Moreover, modelers often find it challenging to include detailed context information in a site model; hence most view images exported from a site model are very rudimentary. Here, the proposed view visualization method includes full context information, which includes trees, water, sky, and other elements from Google's database, allowing for more realistic-looking results.

Secondly, this research proposed a new metric, the "View Score", to evaluate view quality, using areas and weighting factors to quantify user-defined desirable views on a 360° panoramic image of prospective views. One possible application of this new view evaluation method is in building design, where an architect can use the View Score to decide the window location and geometry in order to obtain a better view.

For the proposed View Score, the user can further calculate the view scores from multiple occupant viewpoints (e.g. using the five-viewpoint method described above or a grid of nodes). One could even visualize the View Score spatially with a false-color plan overlay, similar to many daylighting metrics, such as "daylight autonomy", and then calculate an aggregate score, similar to "spatial daylight autonomy", for the room (Reinhart and Jan, 2011).

Quantifying the View Score may not be necessary if a designer only wishes to compare a few options. However, this automated step would allow users to easily compare numerous design options and to combine these results with other outputs, such as daylighting and energy simulation results.

Finally, the proposed workflow makes it possible to consider view as an important factor in the overall window performance. The quantitative View Score allows this metric to be evaluated along with other quantitative results such as simulated daylighting and energy metrics. The proposed new

W. Li, H. Samuelson Developments in the Built Environment 1 (2020) 100005

Fig. 18. Prospective views for selected design options.

Fig. 19. View preference scores for south orientations.
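The weighted-area scoring idea behind the View Score can be illustrated with a short sketch. The paper does not give its exact formula in this excerpt, so the per-pixel category mask, the user weights, and the cosine-latitude correction below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def view_score(labels: np.ndarray, weights: dict) -> float:
    """Hypothetical weighted-area view score for an equirectangular panorama.

    labels  -- integer view category per pixel (e.g. 0=other, 1=sky, 2=greenery)
    weights -- user-defined desirability weight per category, each in [0, 1]

    Each pixel row is weighted by cos(latitude) so that regions near the top
    and bottom of the equirectangular image are not over-counted; the score
    is the desirability-weighted share of the panorama, scaled to 0-100.
    """
    h, w = labels.shape
    lat = (np.arange(h) + 0.5) / h * np.pi - np.pi / 2     # -pi/2 .. pi/2
    pix_area = np.cos(lat)[:, None] * np.ones((1, w))      # solid-angle proxy
    score = 0.0
    for cat, wt in weights.items():
        score += wt * pix_area[labels == cat].sum()
    return 100.0 * score / pix_area.sum()

# toy panorama: top half sky (category 1), bottom half other (category 0)
toy = np.zeros((64, 128), dtype=int)
toy[:32, :] = 1
print(round(view_score(toy, {1: 1.0}), 1))   # 50.0
```

The authors' View Score may differ in how desirable regions are delineated and weighted; the sketch only shows the general "area times weighting factor" structure described in the text.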

The proposed new workflow helps the user find desirable window design scenarios among numerous options in a pre-simulated pool. Importantly, in the authors' opinion, the value of the pool of numerous options lies not in finding the one optimal solution, but rather in allowing the designer to observe trends in the results. For example, a visual ranking of multiple results may show that, for window orientation options within 15 degrees of south, the orientation has a more dramatic impact on view than on energy performance, or vice versa. By observing trends like these, designers can use the information, combined with all their other design criteria not included in the simulations, to make more informed design decisions.

Beyond building design, the view visualization method could also provide value to real estate brokers for showing prospective views to potential buyers, while generating view quality scores for different units within a building could help brokers adjust the price of each unit based on the quality of its views.

4.2. Limitations and future work

Future goals for the proposed workflow are described below. They include obtaining full access to existing software, building a more dynamic view visualization method, and developing a more integrated platform.

The source image used in this paper is from Google Earth Studio, a tool still in the beta phase and not yet released to the public. Although the authors performed a validation of the credibility of the source imagery, future work can consider a more detailed validation. Moreover, since Google is still building its database, some cities might not have full 3D models yet, which restricts the use of the proposed view visualization methods.

For the view visualization method, the authors used different viewpoints to generate view ranges in a room. This method is based on average and extreme cases. Future work could develop a viewpoint grid and extend the proposed method to dynamic view visualization, so that the user can visualize the view from a number of points in the room and from a variety of angles. In this way, the view visualization process becomes more dynamic, and the View Score could be calculated and aggregated for the whole room. The described method does not account for interior objects that block a portion of the window. However, users can model their interior geometries and add these objects to the visualization in the same way that an architect would model a building's interior today; put simply, the interior portion of the architectural modeling process remains unchanged from traditional methods.

Fig. 20. Ranking of all design options that meet the EUI and view preference targets.

Fig. 21. Design options that meet both the EUI and view preference targets.

The proposed window design workflow can be further combined into one integrated platform, using one interface instead of multiple. An online platform could even encourage users to upload their results, eventually providing data for designers and researchers. The spider diagram is easy to read on the online interface and can be collected to build a result profile for different design scenarios.

The view image generated through the proposed method is a virtually constructed view based on a still person's visual field. However, the real perception of view might differ, since humans have a larger peripheral visual field laterally and downward (Spector, 1990). The movement of a person will also influence the perception of view.

A person's perception of view can also differ at different times. In theory, Google Earth Studio also provides time-of-day, date, and cloud attributes, so the user could export a photo at a specific time (Fig. 22). The time-of-day and date attributes can help the user visualize sun locations. However, in our tests, the images did not change much when adding cloud attributes or changing the date to a different season. This might be because Google's airplanes only took photos on clear days in the spring (Dennis, 2017). Future work could integrate new photo-editing algorithms to change the weather and seasons (Laffont et al., 2014).

Fig. 22. Different times of day in Google Earth Studio.

4.3. Outlook

Humans, as opposed to computers, are adept at solving complex problems, and architects are known for dealing with multiple objectives simultaneously. However, existing simulation tools usually help inform one consideration at a time. Compared with existing tools that only give the user one answer per simulation, this research proposes a way to visualize numerous design options, allowing architects to see the patterns in their designs' performance and make more informed final decisions.

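The grid-of-nodes aggregation suggested in the limitations discussion, analogous to "spatial daylight autonomy", can be sketched in a few lines. The function name, the sample per-viewpoint scores, and the 70-point threshold (echoing the view preference target used when narrowing the design pool) are hypothetical, not the authors' implementation:

```python
from statistics import mean

def aggregate_view_scores(point_scores, target=70.0):
    """Aggregate per-viewpoint View Scores over a room's viewpoint grid.

    point_scores -- View Score (0-100) at each node of the grid
    target       -- minimum score counted as a 'good view'

    Returns (mean score, share of grid nodes meeting the target); the
    second value plays the role that 'spatial daylight autonomy' plays
    for daylighting metrics.
    """
    meets = [s >= target for s in point_scores]
    return mean(point_scores), sum(meets) / len(meets)

# hypothetical 2x2 grid of viewpoint nodes
scores = [82.0, 75.5, 64.0, 58.5]
avg, share = aggregate_view_scores(scores)
print(avg, share)   # 70.0 0.5
```

A false-color plan overlay would then simply map each node's score to a color at its grid position, mirroring how daylight-autonomy plots are drawn today.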

Rather than succumbing to one "optimal" scenario limited to only one factor, this research proposes a workflow to help architects more holistically consider trade-offs of multiple factors simultaneously, including the important consideration of view. Since designers have multiple goals to achieve in the process of window design, this workflow makes it easier to see trends in window performance. The ultimate goal of such research is to use simulation tools with both quantitative and qualitative factors to augment human design.

5. Conclusion

This paper proposes a multi-factor window design workflow to help architects visualize view, daylight, and energy consumption simultaneously, while also providing a method to visualize and evaluate prospective views. Compared with existing view visualization methods, the proposed method not only requires less preparation time but also generates more accurate and detailed results. A window design workflow based on Rhino and Grasshopper was also proposed, which enables an interactive comparison of numerous design options. Ultimately, this research provides a tool that can help architects make more informed window design decisions.

Author contributions

Wenting Li: conceptualization, methodology, software, data curation, writing—original draft preparation, writing—review and editing. Holly W. Samuelson: conceptualization, methodology, writing—review and editing, funding acquisition, supervision.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgements

This research was partially funded by the Harvard Graduate School of Design. The authors would like to thank Ilia Yazdanpanah for his help with the literature review on the human benefits of views and with the modeling of the case study for the workflow comparison. The authors are not affiliated with any software provider.

References

Apple, 2017. Cameras. iOS device compatibility reference. https://developer.apple.com/library/archive/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/Cameras/Cameras.html#//apple_ref/doc/uid/TP40013599-CH107-SW1.
Aries, Myriam B.C., Veitch, Jennifer A., Newsham, Guy R., 2010. Windows, view, and office characteristics predict physical and psychological discomfort. J. Environ. Psychol. 30 (4), 533–541. https://doi.org/10.1016/j.jenvp.2009.12.004.
Autodesk. 3ds Max. Accessed December 9, 2019. https://www.autodesk.com/products/3ds-max/overview.
Baranzini, Andrea, Schaerer, Caroline, 2011. A sight for sore eyes: assessing the value of view and land use in the housing market. J. Hous. Econ. 20 (3), 191–199. https://doi.org/10.1016/j.jhe.2011.06.001.
Berkeley Lab. COMFEN. Accessed December 13, 2019. https://windows.lbl.gov/software/comfen.
Dahnoun, Naim, 2017. Stereo vision implementation. In: Multicore DSP. John Wiley & Sons, Chichester, UK, pp. 604–616.
Damigos, Dimitris, Anyfantis, Fotis, 2011. The value of view through the eyes of real estate experts: a fuzzy delphi approach. Landsc. Urban Plann. 101 (2), 171–178. https://doi.org/10.1016/j.landurbplan.2011.02.009.
Dartevelle, Olivier, Deneyer, Arnaud, Bodart, Magali, 2014. PROSOLIS: a web tool for thermal and daylight characteristics comparison of glazing complexes. https://doi.org/10.13140/2.1.1925.1522.
Dennis, Natalie, 2017. Google Earth's incredible 3D imagery, explained. April 18, 2017. https://blog.google/products/earth/google-earths-incredible-3d-imagery-explained/.
Department of Energy, National Renewable Energy Laboratory, Construction Engineering Laboratory, University of Illinois, 2018. EnergyPlus. https://energyplus.net/.
Efficient Windows Collaborative, 2011. Facade design tool. Windows for High-Performance Commercial Buildings. https://www.commercialwindows.org/fdt.php.
Fontenelle, Marília Ramalho, Bastos, Leopoldo Eurico Gonçalves, 2014. The multicriteria approach in the architecture conception: defining windows for an office building in Rio de Janeiro. Build. Environ. 74 (April), 96–105. https://doi.org/10.1016/j.buildenv.2014.01.005.
Glicksman, Leon, Urban, Bryan, Ali, Zehra, Lehar, Matthew, Gouldstone, James, Ray, Steve, 2009. The MIT Design Advisor. http://designadvisor.mit.edu/design/.
Google Inc. Google Maps. Accessed December 7, 2019b. https://www.google.com/maps/@40.7435462,-73.9712484,15z.
Google LLC. Google Earth. Accessed December 8, 2019a. https://earth.google.com/web/.
Google LLC, 2019. Google Earth Studio. January 21, 2019. https://www.google.com/earth/studio/.
Hellinga, Hester, Hordijk, Truus, 2014. The D&V analysis method: a method for the analysis of daylight access and view quality. Build. Environ. 79 (September), 101–114. https://doi.org/10.1016/j.buildenv.2014.04.032.
Kaplan, Rachel, 2001. The nature of the view from home: psychological benefits. Environ. Behav. 33 (4), 507–542. https://doi.org/10.1177/00139160121973115.
Ladybug Tools, 2018. Ladybug. https://www.ladybug.tools/about.html.
Laffont, Pierre-Yves, Ren, Zhile, Tao, Xiaofeng, Qian, Chao, Hays, James, 2014. Transient attributes for high-level understanding and editing of outdoor scenes. ACM Trans. Graph. 33 (4), 1–11. https://doi.org/10.1145/2601097.2601101.
Leather, Phil, Pyrgas, Mike, Beale, Di, Lawrence, Claire, 1998. Windows in the workplace: sunlight, view, and occupational stress. Environ. Behav. 30 (6), 739–762.
Liang, Jianming, Gong, Jianhua, Li, Wenhang, 2018. Applications and impacts of Google Earth: a decadal review (2006–2016). ISPRS J. Photogrammetry Remote Sens. 146 (December), 91–107. https://doi.org/10.1016/j.isprsjprs.2018.08.019.
Liu, Lingyun, Stamos, Ioannis, 2012. A systematic approach for 2D-image to 3D-range registration in urban environments. Comput. Vis. Image Understand. 116 (1), 25–37.
Mardaljevic, John, 2019. Aperture-based daylight modelling: introducing the 'view lumen'. In: Proceedings of the 16th IBPSA International Conference and Exhibition.
McNeel, 2019a. Grasshopper. September 23, 2019. https://www.rhino3d.com/6/new/grasshopper.
McNeel, 2019b. Rhino (version 6). September 23, 2019. https://www.rhino3d.com/.
Moore, Ernest O., 1981. A prison environment's effect on health care service demands. J. Environ. Syst. 11 (1), 17–34. https://doi.org/10.2190/KM50-WH2K-K2D1-DM69.
NASA, July 1995. Anthropometry and biomechanics. Man-Systems Integration Standards. https://msis.jsc.nasa.gov/sections/section03.htm.
Raanaas, Ruth Kjaersti, Patil, Grete Grindal, Hartig, Terry, 2012. Health benefits of a view of nature through the window: a quasi-experimental study of patients in a residential rehabilitation center. Clin. Rehabil. 26 (1), 21–32. https://doi.org/10.1177/0269215511412800.
Reinhart, Christoph, Wienold, Jan, 2011. The daylighting dashboard – a simulation-based design analysis for daylit spaces. Build. Environ. 46 (2), 386–396. https://doi.org/10.1016/j.buildenv.2010.08.001.
Solemma, 2018. DIVA-for-Rhino (version 4). https://www.solemma.com/Diva.html.
Spector, Robert, 1990. Visual fields. In: Clinical Methods: the History, Physical, and Laboratory Examinations, third ed. Butterworths, Boston. https://www.ncbi.nlm.nih.gov/books/NBK220/.
Thornton Tomasetti, 2019. Design Explorer. http://core.thorntontomasetti.com/design-explorer/.
Turan, Irmak, Reinhart, Christoph, Kocher, Michael, 2019. Evaluating spatially-distributed views in open plan work spaces. In: Proceedings of the 16th IBPSA International Conference and Exhibition.
Ulrich, Roger S., 1984. View through a window may influence recovery from surgery. Science 224, 420.
Verderber, Stephen, 1986. Dimensions of person-window transactions in the hospital environment. Environ. Behav. 18 (4), 450–466.

