Android Ui Layouts: Textview Imageview Edittext Radiobutton

The document discusses Android UI layouts and how they are used to define the user interface for Android applications. It explains that layouts combine Views and ViewGroups to arrange interface components hierarchically. Common layout types include LinearLayout, RelativeLayout, and ConstraintLayout. Layouts can be defined in XML files and loaded from activities, or programmatically in Kotlin code. Attributes like android:layout_width and android:layout_height specify properties of Views and ViewGroups.

Uploaded by

V MERIN SHOBI

Android UI Layouts

An Android Layout defines the user interface that holds the UI controls or
widgets displayed on the screen of an Android application or activity.
Generally, every application is a combination of Views and ViewGroups. An
Android application typically contains a number of activities, and each
activity can be thought of as one page of the application. Each activity
therefore contains multiple user-interface components, and those components
are instances of View and ViewGroup. All the elements in a layout are built
using a hierarchy of View and ViewGroup objects.

View

A View is a user-interface element used to create interactive UI
components such as TextView, ImageView, EditText, and RadioButton, and it is
responsible for drawing and event handling. Views are generally called widgets.

ViewGroup

A ViewGroup acts as a base class for layouts and layout parameters; it holds
other Views or ViewGroups and defines the layout properties. ViewGroups are
generally called layouts.

The Android framework allows us to use UI elements or widgets in two ways:
 Use UI elements in the XML file
 Create elements in the Kotlin file dynamically
Types of Android Layout

 Android Linear Layout: LinearLayout is a ViewGroup subclass used to
arrange child View elements one after another in a single direction, either
horizontally or vertically, based on the orientation property.
 Android Relative Layout: RelativeLayout is a ViewGroup subclass used to
specify the position of child View elements relative to each other (e.g., A to
the right of B) or relative to the parent (e.g., fixed to the top of the parent).
 Android Constraint Layout: ConstraintLayout is a ViewGroup subclass used
to position every child View by defining layout constraints relative to the
other views present. A ConstraintLayout is similar to a RelativeLayout, but
more powerful.
 Android Frame Layout: FrameLayout is a ViewGroup subclass used to stack
the View elements it contains on top of each other, typically to display a
single View inside the FrameLayout.
 Android Table Layout: TableLayout is a ViewGroup subclass used to
display child View elements in rows and columns.
 Android Web View: WebView is a browser component used to display web
pages inside an activity layout.
 Android ListView: ListView is a ViewGroup used to display a scrollable
list of items in a single column.
 Android Grid View: GridView is a ViewGroup used to display a scrollable
list of items in a grid of rows and columns.
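As an illustrative sketch of the RelativeLayout rules described above, one child can be placed to the right of another. The view ids labelA and labelB below are hypothetical, chosen only for this example:

```xml
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- labelB is anchored to the top of the parent -->
    <TextView
        android:id="@+id/labelB"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentTop="true"
        android:text="B"/>

    <!-- labelA is positioned relative to its sibling: "A to the right of B" -->
    <TextView
        android:id="@+id/labelA"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_toRightOf="@id/labelB"
        android:text="A"/>

</RelativeLayout>
```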

Use UI Elements in the XML file

Here we can create a layout in a manner similar to web pages. The XML layout
file contains at least one root element, inside which additional layout
elements or widgets can be added to build a View hierarchy. Following is an
example:

 XML

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:orientation="vertical"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <!--EditText with id editText-->
    <EditText
        android:id="@+id/editText"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:layout_margin="16dp"
        android:hint="Input"
        android:inputType="text"/>

    <!--Button with id showInput-->
    <Button
        android:id="@+id/showInput"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="center_horizontal"
        android:text="show"
        android:backgroundTint="@color/colorPrimary"
        android:textColor="@android:color/white"/>

</LinearLayout>

Load XML Layout File and its elements from an Activity

When we have created the layout, we need to load the XML layout resource from our
activity onCreate() callback method and access the UI element from the XML
using findViewById. 
override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)

    // finding the button
    val showButton = findViewById<Button>(R.id.showInput)

    // finding the edit text
    val editText = findViewById<EditText>(R.id.editText)
}
Observing the code above, we can see that the layout is passed to
the setContentView method in the form R.layout.activity_main. Generally,
when our activity launches, the onCreate() callback method is called by the
Android framework to load the required layout for the activity.

Create elements in the Kotlin file Dynamically

We can create or instantiate UI elements or widgets at runtime by using
custom View and ViewGroup objects programmatically in the Kotlin file. Below is
an example of programmatically creating a layout in an activity, using a
LinearLayout to hold an EditText and a Button.

 Kotlin
import android.os.Bundle
import android.widget.Button
import android.widget.EditText
import android.widget.LinearLayout
import android.widget.Toast
import androidx.appcompat.app.AppCompatActivity

class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        // create the button
        val showButton = Button(this)
        showButton.text = "Submit"

        // create the editText
        val editText = EditText(this)

        // the root LinearLayout (id l_layout) is defined in activity_main.xml
        val linearLayout = findViewById<LinearLayout>(R.id.l_layout)
        linearLayout.addView(editText)
        linearLayout.addView(showButton)

        // Setting On Click Listener
        showButton.setOnClickListener {
            // Getting the user input
            val text = editText.text
            // Showing the user input
            Toast.makeText(this, text, Toast.LENGTH_SHORT).show()
        }
    }
}

Different Attributes of the Layouts

android:id - Used to specify the id of the view.
android:layout_width - Used to declare the width of View and ViewGroup elements in the layout.
android:layout_height - Used to declare the height of View and ViewGroup elements in the layout.
android:layout_marginLeft - Used to declare the extra space used on the left side of View and ViewGroup elements.
android:layout_marginRight - Used to declare the extra space used on the right side of View and ViewGroup elements.
android:layout_marginTop - Used to declare the extra space used on the top side of View and ViewGroup elements.
android:layout_marginBottom - Used to declare the extra space used on the bottom side of View and ViewGroup elements.
android:layout_gravity - Used to define how child Views are positioned in the layout.
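A minimal sketch applying the attributes above to a single View. The id title and the text are placeholder values, and android:layout_gravity only takes effect inside parents that honor it, such as LinearLayout or FrameLayout:

```xml
<!-- placed inside a LinearLayout or FrameLayout parent -->
<TextView
    android:id="@+id/title"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:layout_marginLeft="16dp"
    android:layout_marginRight="16dp"
    android:layout_marginTop="8dp"
    android:layout_marginBottom="8dp"
    android:layout_gravity="center_horizontal"
    android:text="Title"/>
```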

VoiceXML
VoiceXML (VXML) is a digital document standard for specifying interactive media and voice
dialogs between humans and computers. It is used for developing audio and voice response
applications, such as banking systems and automated customer service portals. VoiceXML
applications are developed and deployed in a manner analogous to how a web
browser interprets and visually renders the Hypertext Markup Language (HTML) it receives from
a web server. VoiceXML documents are interpreted by a voice browser and in common
deployment architectures, users interact with voice browsers via the public switched telephone
network (PSTN).
The VoiceXML document format is based on Extensible Markup Language (XML). It is a
standard developed by the World Wide Web Consortium (W3C).
VoiceXML applications are commonly used in many industries and segments of commerce.
These applications include order inquiry, package tracking, driving directions, emergency
notification, wake-up, flight tracking, voice access to email, customer relationship management,
prescription refilling, audio news magazines, voice dialing, real-estate information and
national directory assistance applications.
VoiceXML has tags that instruct the voice browser to provide speech synthesis,
automatic speech recognition, dialog management, and audio playback. The following is an
example of a VoiceXML document:

<vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
  <form>
    <block>
      <prompt>
        Hello world!
      </prompt>
    </block>
  </form>
</vxml>
When interpreted by a VoiceXML interpreter, this will output "Hello world!" with
synthesized speech.
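Beyond the hello-world block, a VoiceXML form usually collects caller input through field elements, which combine a prompt, a grammar, and an action once the field is filled. A minimal sketch, in which the grammar file drink.grxml is an assumed resource:

```xml
<vxml version="2.0" xmlns="http://www.w3.org/2001/vxml">
  <form id="order">
    <field name="drink">
      <!-- spoken to the caller by the speech synthesizer -->
      <prompt>Would you like coffee or tea?</prompt>
      <!-- drink.grxml is a hypothetical SRGS grammar resource -->
      <grammar src="drink.grxml" type="application/srgs+xml"/>
      <!-- executed once the recognizer fills the field -->
      <filled>
        <prompt>You said <value expr="drink"/>.</prompt>
      </filled>
    </field>
  </form>
</vxml>
```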
Typically, HTTP is used as the transport protocol for fetching VoiceXML pages. Some
applications may use static VoiceXML pages, while others rely on dynamic VoiceXML page
generation using an application server like Tomcat, WebLogic, IIS, or WebSphere.
Historically, VoiceXML platform vendors have implemented the standard in different ways, and
added proprietary features. But the VoiceXML 2.0 standard, adopted as a W3C
Recommendation on 16 March 2004, clarified most areas of difference. The VoiceXML Forum,
an industry group promoting the use of the standard, provides a conformance testing process
that certifies vendors' implementations as conformant.

History
AT&T Corporation, IBM, Lucent, and Motorola formed the VoiceXML Forum in March 1999, in
order to develop a standard markup language for specifying voice dialogs. By September 1999
the Forum released VoiceXML 0.9 for member comment, and in March 2000 they published
VoiceXML 1.0. Soon afterwards, the Forum turned over control of the standard to the
W3C.[1] The W3C produced several intermediate versions of VoiceXML 2.0, which reached the
final "Recommendation" stage in March 2004.[2]
VoiceXML 2.1 added a relatively small set of additional features to VoiceXML 2.0, based on
feedback from implementations of the 2.0 standard. It is backward compatible with VoiceXML 2.0
and reached W3C Recommendation status in June 2007.[3]

Future versions of the standard


VoiceXML 3.0 will be the next major release of VoiceXML, with major new features. It
includes a new XML statechart description language called SCXML.

Related standards
The W3C's Speech Interface Framework also defines these other standards closely associated
with VoiceXML.

SRGS and SISR


The Speech Recognition Grammar Specification (SRGS) is used to tell the speech recognizer
what sentence patterns it should expect to hear: these patterns are called grammars. Once the
speech recognizer determines the most likely sentence it heard, it needs to extract the semantic
meaning from that sentence and return it to the VoiceXML interpreter. This semantic
interpretation is specified via the Semantic Interpretation for Speech Recognition (SISR)
standard. SISR is used inside SRGS to specify the semantic results associated with the
grammars, i.e., the set of ECMAScript assignments that create the semantic structure returned
by the speech recognizer.
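A minimal SRGS grammar sketch with SISR tags, assuming the semantics/1.0 tag format; each tag element assigns the ECMAScript out variable that the recognizer returns to the VoiceXML interpreter (the rule name drink and the phrases are illustrative):

```xml
<grammar xmlns="http://www.w3.org/2001/06/grammar" version="1.0"
         xml:lang="en-US" root="drink" tag-format="semantics/1.0">
  <rule id="drink" scope="public">
    <one-of>
      <!-- each alternative sets the semantic result via SISR -->
      <item>coffee <tag>out = "coffee";</tag></item>
      <item>a cup of tea <tag>out = "tea";</tag></item>
    </one-of>
  </rule>
</grammar>
```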

SSML
The Speech Synthesis Markup Language (SSML) is used to decorate textual prompts with
information on how best to render them in synthetic speech, for example which speech
synthesizer voice to use or when to speak louder or softer.
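A minimal SSML sketch showing a louder passage via the prosody element and a pause via break; the spoken text is placeholder content:

```xml
<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis"
       xml:lang="en-US">
  Your balance is
  <!-- render this span louder than the surrounding text -->
  <prosody volume="loud">two hundred dollars</prosody>.
  <break time="500ms"/>
  Goodbye.
</speak>
```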

PLS
The Pronunciation Lexicon Specification (PLS) is used to define how words are pronounced. The
generated pronunciation information is meant to be used by both speech recognizers and
speech synthesizers in voice browsing applications.
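A minimal PLS sketch defining one pronunciation in the IPA alphabet; the word and its transcription are illustrative:

```xml
<lexicon version="1.0"
         xmlns="http://www.w3.org/2005/01/pronunciation-lexicon"
         alphabet="ipa" xml:lang="en-US">
  <lexeme>
    <!-- written form and its phonetic transcription -->
    <grapheme>tomato</grapheme>
    <phoneme>təˈmeɪtoʊ</phoneme>
  </lexeme>
</lexicon>
```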

CCXML
The Call Control eXtensible Markup Language (CCXML) is a complementary W3C standard. A
CCXML interpreter is used on some VoiceXML platforms to handle the initial call setup between
the caller and the voice browser, and to provide telephony services like call transfer and
disconnect to the voice browser. CCXML can also be used in non-VoiceXML contexts.

MSML, MSCML, MediaCTRL


In media server applications, it is often necessary for several call legs to interact with each other,
for example in a multi-party conference. Some deficiencies were identified in VoiceXML for this
application and so companies designed specific scripting languages to deal with this
environment. The Media Server Markup Language (MSML) was Convedia's solution, and Media
Server Control Markup Language (MSCML) was Snowshore's solution. Snowshore is now
owned by Dialogic and Convedia is now owned by Radisys. These languages also contain
'hooks' so that external scripts (like VoiceXML) can run on call legs where IVR functionality is
required.
There was an IETF working group called mediactrl ("media control") that was working on a
successor for these scripting systems, which it was hoped would progress to an open and
widely adopted standard.[4] The mediactrl working group concluded in 2013.[5]
