928 results for PRACTICAL APPLICATIONS
Abstract:
Lately, the mobile data market has entered a growth stage triggered by two factors: the affordability of mobile broadband and the availability of data-friendly devices. At this stage, market growth no longer depends on push strategies from suppliers; on the contrary, demand is now driving the market. However, it will not be easy for mobile operating companies to cope with the demand to come in the near future. The infrastructure needed to support that demand is far from complete, and operators are forced to make heavy investments to upgrade and expand their networks. To decide how to handle present and upcoming demand, they need to identify and understand the characteristics of the scenarios they face. This is precisely the aim of this article, which provides figures on the consequences for mobile infrastructures of a generalised mobile media uptake. Data from the Spanish mobile deployment case have been used to arrive at practical figures and illustrative results, but the conclusions are easily extended to other countries and regions.
Abstract:
Many computer vision and human-computer interaction applications developed in recent years need to evaluate complex and continuous mathematical functions as an essential step toward proper operation. However, rigorous evaluation of functions of this kind often implies a very high computational cost, unacceptable in real-time applications. To alleviate this problem, functions are commonly approximated by simpler piecewise-polynomial representations. Following this idea, we propose a novel, efficient, and practical technique to evaluate complex and continuous functions using a nearly optimal design of two types of piecewise linear approximations in the case of a large budget of evaluation subintervals. To this end, we develop a thorough error analysis that yields asymptotically tight bounds to accurately quantify the approximation performance of both representations. It improves upon previous error estimates and allows the user to control the trade-off between the approximation error and the number of evaluation subintervals. To guarantee real-time operation, the method is suitable for, but not limited to, an efficient implementation on modern Graphics Processing Units (GPUs), where it outperforms previous alternative approaches by exploiting the fixed-function interpolation routines present in their texture units. The proposed technique is a perfect match for any application requiring the evaluation of continuous functions. We have measured its quality and efficiency in detail on several functions and, in particular, on the Gaussian function, because it is extensively used in many areas of computer vision and cybernetics and is expensive to evaluate.
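As a concrete illustration of the general idea (a plain uniform-breakpoint lookup table, not the abstract's nearly optimal breakpoint design), a piecewise linear approximation of the Gaussian and its worst-case error can be sketched as follows; `np.interp` stands in for the fixed-function linear interpolation a GPU texture unit performs in hardware:

```python
import numpy as np

def gaussian(x, sigma=1.0):
    """Target function: unnormalised Gaussian, expensive in tight loops."""
    return np.exp(-0.5 * (x / sigma) ** 2)

def build_lut(f, a, b, n):
    """Sample f at n + 1 uniform breakpoints over [a, b]."""
    xs = np.linspace(a, b, n + 1)
    return xs, f(xs)

def eval_pwl(x, xs, ys):
    """Piecewise linear interpolation between the stored samples
    (this is what a GPU texture fetch with linear filtering computes)."""
    return np.interp(x, xs, ys)

# Measure the worst-case approximation error on a dense grid.
a, b, n = 0.0, 4.0, 256          # budget of n evaluation subintervals
xs, ys = build_lut(gaussian, a, b, n)
dense = np.linspace(a, b, 100001)
max_err = np.max(np.abs(gaussian(dense) - eval_pwl(dense, xs, ys)))
# For an interpolating PWL with subinterval width h, the classical bound is
# max_err <= h**2 / 8 * max|f''|, i.e. the error decays as O(1/n**2).
```

Doubling the subinterval budget `n` should roughly quarter `max_err`, which is the error/budget trade-off the abstract's analysis quantifies tightly.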
Abstract:
The Department of Structural Analysis of the University of Santander has long been involved in solving the country's practical engineering problems. Some of these have required non-conventional methods of analysis in order to achieve adequate engineering answers. As examples of the increasing application of non-linear computer codes in present-day engineering practice, some cases are briefly presented. In each case, only the main features of the problem involved and the solution used to solve it are shown.
Abstract:
Applied colorimetry is an important module in the programme of the elective subject "Colour Science: Industrial Applications". This course is taught in the Optics and Optometry Degree and has been used as a test bed for the application of new teaching and assessment techniques consistent with the new European Higher Education Area. In particular, the main objective was to reduce attendance at lessons and encourage the individual and collective work of students. The reason for this approach is the idea that students are able to work at their own learning pace. Within this dynamic, we propose online lab practice based on Excel templates that our research group has developed ad hoc for different aspects of colorimetry, such as conversion between colour spaces, calculation of perceptual descriptors (hue, saturation, lightness), calculation of colour differences, colour matching of dyes, etc. The practice presented in this paper focuses on the learning of colour differences. The session is based on a specific Excel template that computes the colour differences and plots different graphs of them, using the CIE ΔE and CIE ΔE94 formulas defined on the CIELAB colour space. This template is embedded in a website that directs the student's work in an orderly and organized way. The aim was to unify all the student work in one website, so that the student is able to learn autonomously and sequentially, at his or her own pace. To this end, all the tools, links and documents are collected for each proposed activity in order to achieve guided, specific objectives. In the context of educational innovation, this type of website is normally called a WebQuest. The design of a WebQuest is established according to the criteria of usability and simplicity. There are great advantages to using WebQuests versus the "Campus Virtual" toolbox available at the University of Alicante.
The Campus Virtual is an unfriendly environment for this specific purpose, as activities are organized in different sections depending on whether the activity is a discussion, an exercise, a self-assessment or a download of materials. With this separation, it is more difficult for the student to follow an organized sequence. Our WebQuest, by contrast, provides a more intuitive graphical environment in which all the tasks, and the resources needed to complete them, are grouped and organized in a linear sequence. In this way, guided student learning is optimized. Furthermore, with this simplification, the student focuses on learning rather than on wasting resources. Finally, this tool has a wide range of potential applications: online courses on applied colorimetry for postgraduate students, OpenCourseWare, etc.
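The colour-difference computations such a template performs can be sketched compactly, assuming the standard CIE 1976 (ΔE*ab) and CIE 1994 (graphic-arts weighting) formulas over CIELAB coordinates; this is a generic illustration, not the authors' Excel template:

```python
import math

def delta_e76(lab1, lab2):
    """CIE 1976 colour difference: Euclidean distance in CIELAB."""
    return math.dist(lab1, lab2)

def delta_e94(lab1, lab2, kL=1.0, kC=1.0, kH=1.0):
    """CIE 1994 colour difference with graphic-arts weighting.
    Note: asymmetric -- the first argument is treated as the reference."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    dL = L1 - L2
    C1, C2 = math.hypot(a1, b1), math.hypot(a2, b2)
    dC = C1 - C2
    da, db = a1 - a2, b1 - b2
    dH2 = max(0.0, da * da + db * db - dC * dC)   # clamp rounding noise
    SL, SC, SH = 1.0, 1.0 + 0.045 * C1, 1.0 + 0.015 * C1
    return math.sqrt((dL / (kL * SL)) ** 2
                     + (dC / (kC * SC)) ** 2
                     + dH2 / (kH * SH) ** 2)

# Two nearby colours in CIELAB (illustrative values).
de76 = delta_e76((50, 10, 10), (55, 12, 12))
de94 = delta_e94((50, 10, 10), (55, 12, 12))
```

Plotting `de76` against `de94` for a set of sample pairs, as the template's graphs do, makes the chroma-dependent weighting of ΔE94 visible to the student.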
Abstract:
Current model-driven Web Engineering approaches (such as OO-H, UWE or WebML) provide a set of methods and supporting tools for the systematic design and development of Web applications. Each method addresses different concerns using separate models (content, navigation, presentation, business logic, etc.) and provides model compilers that produce most of the logic and Web pages of the application from these models. However, these proposals also have some limitations, especially for exchanging models or representing further modeling concerns, such as architectural styles, technology independence, or distribution. A possible solution to these issues is to make model-driven Web Engineering proposals interoperate, so that they can complement each other and exchange models between the different tools. MDWEnet is a recent initiative started by a small group of researchers working on model-driven Web Engineering (MDWE). Its goal is to improve the interoperability of current practices and tools for the model-driven development of Web applications. The proposal is based on the strengths of current model-driven Web Engineering methods and on the existing experience and knowledge in the field. This paper presents the background, motivation, scope, and objectives of MDWEnet. Furthermore, it reports on MDWEnet's results and achievements so far, and on its future plan of action.
Abstract:
The development of transversal competencies provides an integral education. However, their practical implementation across different subjects is not a trivial task. Several issues should first be solved in an optimal way to take advantage of the synergy among subjects. The main issues are: i) the need for a common space for document management; ii) the availability of the documents everywhere and at any time; and iii) the possibility of collaborating on document editing tasks. A virtual portfolio was implemented for the students, which allows the assessment of all the subjects in a global way. To this end we used Google Apps, owing to its free access, availability and suitability for collaborative editing tasks.
Abstract:
This document presents the work carried out at the company Present Technologies as part of the academic discipline Internship/Industrial Project for the Master's degree in Informatics and Systems, Software Development branch, at Instituto Superior de Engenharia de Coimbra. The area of mobile web applications has grown exponentially over the last few years, turning it into a very dynamic field where new development platforms and frameworks are constantly emerging. The internship therefore consisted of the study of two new mobile operating systems, Tizen and Firefox OS, as well as two frameworks for packaging mobile web applications – Adobe PhoneGap and Appcelerator Titanium. These platforms are of direct interest to Present Technologies, since it intends to use them in its future projects in general and in the Phune Gaming project in particular. Since television is one of Present Technologies' business areas, it was decided during the internship to additionally study two Smart TV platforms, namely Samsung Smart TV and Opera TV, which was considered valuable knowledge for the company. For each platform, a study was performed of its architecture, supported standards and the development tools provided; nevertheless, the focus was on applications, and for this reason a practical case study was conducted. The case studies consisted of the creation of a prototype or, in the case of the packaging tools, the packaging of an application, in order to prove the feasibility of the applications for Present Technologies' needs. The outcome of the work performed during the internship is that it raised Present Technologies' awareness of the studied platforms, providing the company with prototypes and written documentation for the platforms' successful use in future projects.
Abstract:
Kalman inverse filtering is used to develop a methodology for real-time estimation of the forces acting at the tyre-road interface on large off-highway mining trucks. The system model formulated is capable of estimating the three components of tyre-force at each wheel of the truck using a practical set of measurements and inputs. The estimated tyre-forces track well those simulated by an ADAMS virtual-truck model. A sensitivity analysis determines the susceptibility of the tyre-force estimates to uncertainties in the truck's parameters.
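The core idea behind such inverse filtering, augmenting the state vector with the unknown input force (modelled as a random walk) and letting an ordinary Kalman filter estimate it from motion measurements, can be sketched in one dimension. The mass, noise levels and step counts below are illustrative assumptions, not the paper's three-component truck model:

```python
import numpy as np

rng = np.random.default_rng(0)
m, dt, steps = 1000.0, 0.01, 2000      # mass [kg], time step [s]
F_true = 500.0                         # unknown interface force to recover [N]

# Augmented state x = [position, velocity, force]; the unmeasured force
# is modelled as a random walk so the filter can estimate and track it.
A = np.array([[1.0, dt, 0.0],
              [0.0, 1.0, dt / m],
              [0.0, 0.0, 1.0]])
H = np.array([[1.0, 0.0, 0.0]])        # only position is measured
Q = np.diag([0.0, 0.0, 1.0])           # process noise lets the force move
R = np.array([[1e-6]])                 # position noise variance (std 1 mm)

x_est = np.zeros(3)
P = np.diag([1.0, 1.0, 1e6])           # start very unsure about the force
pos, vel = 0.0, 0.0
for _ in range(steps):
    # Simulate the true system and a noisy position measurement.
    pos, vel = pos + vel * dt, vel + (F_true / m) * dt
    z = pos + rng.normal(0.0, 1e-3)
    # Standard Kalman predict / update on the augmented state.
    x_est = A @ x_est
    P = A @ P @ A.T + Q
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x_est = x_est + (K @ (np.array([z]) - H @ x_est)).ravel()
    P = (np.eye(3) - K @ H) @ P

force_est = x_est[2]                   # converges towards F_true
```

The same construction extends to the full vehicle model, where the augmented state carries the three tyre-force components at each wheel.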
Abstract:
Entanglement purification protocols play an important role in the distribution of entangled systems, which is necessary for various quantum information processing applications. We consider the effects of photodetector efficiency and bandwidth, channel loss and mode mismatch on the operation of an optical entanglement purification protocol. We derive necessary detector and mode-matching requirements to facilitate practical operation of such a scheme, without having to resort to destructive coincidence-type demonstrations.
Abstract:
This thesis presents an investigation of synchronisation and causality, motivated by problems in computational neuroscience. It addresses both theoretical and practical signal processing issues regarding the estimation of interdependence from a set of multivariate data generated by a complex underlying dynamical system. The topic is driven by a series of problems in neuroscience, which represent the principal motivation behind this work. The underlying system is the human brain, and the generative process of the data is based on modern electromagnetic neuroimaging methods. In this thesis, the underlying functional brain mechanisms are described using the recent mathematical formalism of dynamical systems on complex networks. This is justified principally on the grounds of the complex hierarchical and multiscale nature of the brain, and it offers new methods of analysis with which to model its emergent phenomena. A fundamental approach to studying neural activity is to investigate the connectivity pattern developed by the brain's complex network. Three types of connectivity are important to study: 1) anatomical connectivity, referring to the physical links forming the topology of the brain network; 2) effective connectivity, concerning the way neural elements communicate with each other through the brain's anatomical structure, via phenomena of synchronisation and information transfer; 3) functional connectivity, an epistemic concept which refers to the interdependence between data measured from the brain network. The main contribution of this thesis is to present, apply and discuss novel functional connectivity algorithms designed to extract different specific aspects of the interaction between the underlying generators of the data. Firstly, a univariate statistic is developed to allow indirect assessment of synchronisation in the local network from a single time series.
This approach is useful for inferring coupling in a local cortical area as observed by a single measurement electrode. Secondly, different existing methods of phase synchronisation are considered from the perspective of experimental data analysis and the inference of coupling from observed data. These methods are designed to address the estimation of medium- to long-range connectivity, and their differences are particularly relevant in the context of volume conduction, which is known to produce spurious detections of connectivity. Finally, an asymmetric temporal metric is introduced in order to detect the direction of the coupling between different regions of the brain. The method developed in this thesis is based on a machine learning extension of the well-known concept of Granger causality. The discussion is developed alongside examples of synthetic and real experimental data. The synthetic data are simulations of complex dynamical systems intended to mimic the behaviour of simple cortical neural assemblies; they are helpful for testing the techniques developed in this thesis. The real datasets illustrate the problem of brain connectivity in the case of important neurological disorders such as epilepsy and Parkinson's disease. The methods of functional connectivity in this thesis are applied to intracranial EEG recordings in order to extract features which characterize the underlying spatiotemporal dynamics before, during and after an epileptic seizure, and to predict seizure location and onset prior to conventional electrographic signs. The methodology is also applied to an MEG dataset containing healthy, Parkinson's and dementia subjects with the aim of distinguishing pathological from physiological patterns of connectivity.
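The classical variance-ratio form of Granger causality that underlies such directional metrics (a plain least-squares version, not the machine learning extension developed in the thesis) can be sketched on synthetic data where one series drives the other with a one-step lag:

```python
import numpy as np

def var_residual(target, predictors, p):
    """Residual variance of an order-p linear autoregression of `target`
    on p lags of each series in `predictors` (plus an intercept)."""
    n = len(target)
    cols = [np.ones(n - p)]
    for s in predictors:
        for k in range(1, p + 1):
            cols.append(s[p - k:n - k])        # lag-k values aligned to t
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, target[p:], rcond=None)
    resid = target[p:] - X @ beta
    return resid.var()

def granger(x, y, p=2):
    """Granger statistic for y -> x: log ratio of the restricted
    (past of x only) to the full (past of x and y) residual variance.
    Values well above zero suggest the past of y helps predict x."""
    return np.log(var_residual(x, [x], p) / var_residual(x, [x, y], p))

# Synthetic example: y drives x, not vice versa.
rng = np.random.default_rng(1)
n = 2000
y = rng.normal(size=n)
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + 0.1 * rng.normal()

y_to_x = granger(x, y)   # large: y's past strongly improves prediction of x
x_to_y = granger(y, x)   # near zero: x carries no extra information about y
```

The asymmetry of the two statistics is exactly what makes the measure directional, which is the property the thesis's temporal metric builds on.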
Abstract:
This thesis describes a detailed study of advanced fibre grating devices using Bragg (FBG) and long-period (LPG) structures and their applications in optical communications and sensing. The major contributions presented in this thesis are summarised below. One of the most important contributions of the research work presented in this thesis is a systematic theoretical study of many distinctive fibre grating structures. Starting from the Maxwell equations, the coupled-mode equations for both FBG and LPG were derived and the mode-overlap factor was discussed analytically. Computer simulation programmes using the transfer-matrix method, built upon the coupled-mode equations, were developed, enabling simulation of the spectral response, in terms of reflectivity, bandwidth, sidelobes and dispersion, of gratings of different structures, including uniform, chirped, phase-shifted, Moiré and sampled Bragg gratings, and phase-shifted and cascaded long-period gratings. Although the majority of these structures were modelled numerically, analytical expressions with a clear physical picture were developed for some complex structures. Several apodisation functions were proposed to improve sidelobe suppression, which guided the effective production of practical devices for demanding applications. Fibre grating fabrication forms the other major part of the Ph.D. programme. Both the holographic and the scan-phase-mask methods were employed to fabricate Bragg and long-period gratings of standard and novel structures. Significant improvements were made in particular to the scan-phase-mask method, enabling arbitrary tailoring of the spectral response of grating devices. Two specific techniques, slow shifting and fast dithering of the phase mask implemented by a computer-controlled piezo, were developed to write high-quality phase-shifted, sampled and apodised gratings.
A large number of LabVIEW programmes were written to implement standard and novel fabrication techniques. In addition, some fundamental studies of grating growth in relation to UV exposure and the hydrogenation-induced index change were carried out. In particular, Type IIa gratings in non-hydrogenated B/Ge co-doped fibres and a re-generated grating in hydrogenated B/Ge fibre were investigated, showing a significant reduction in thermal coefficient. Optical sensing applications utilising fibre grating devices form the third major part of the research work presented in this thesis. Several novel sensing and sensing-demodulation experiments were implemented. For the first time, an intensity and wavelength dual-coding interrogation technique was demonstrated, showing a significantly enhanced capacity for grating sensor multiplexing. Based on mode-splitting measurement, instead of the conventional wavelength-shifting detection technique, successful demonstrations were also made of optical load and bend sensing of ultra-high sensitivity employing LPG structures. In addition, edge filters and low-loss, high-rejection bandpass filters with a 50 nm stop-band were fabricated for applications in optical sensing and high-speed telecommunication systems.
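For the simplest structure such simulations treat, a uniform Bragg grating, the coupled-mode equations admit a well-known closed-form reflectivity spectrum. A short sketch (the coupling coefficient, length and detuning grid below are illustrative values, not the thesis's designs):

```python
import numpy as np

def fbg_reflectivity(delta, kappa, L):
    """Reflectivity of a uniform fibre Bragg grating from the closed-form
    solution of the coupled-mode equations:
        R = kappa^2 sinh^2(gL) / (delta^2 sinh^2(gL) + g^2 cosh^2(gL)),
    with g = sqrt(kappa^2 - delta^2) (complex sqrt handles |delta| > kappa,
    which produces the oscillatory sidelobes outside the stop-band).
    delta : detuning from the Bragg condition [1/m]
    kappa : AC coupling coefficient [1/m]
    L     : grating length [m]
    """
    gamma = np.sqrt(np.asarray(kappa ** 2 - delta ** 2, dtype=complex))
    num = kappa ** 2 * np.sinh(gamma * L) ** 2
    den = delta ** 2 * np.sinh(gamma * L) ** 2 \
        + gamma ** 2 * np.cosh(gamma * L) ** 2
    return (num / den).real

# A 10 mm grating with kappa * L = 2: peak reflectivity tanh(2)^2 ~ 93 %.
L = 0.010
kappa = 200.0
delta = np.linspace(-1500.0, 1500.0, 2001)   # grid avoids delta = +/-kappa
R = fbg_reflectivity(delta, kappa, L)
```

Chirped, phase-shifted or sampled structures then follow by chaining such elementary sections in a transfer-matrix product rather than using the single closed form.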
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT. This thesis describes a detailed study of advanced optical fibre sensors based on the fibre Bragg grating (FBG), tilted fibre Bragg grating (TFBG) and long-period grating (LPG), and their applications in optical communications and sensing. The major contributions presented in this thesis are summarised below. The most important contribution of the research work presented in this thesis is the implementation of in-fibre grating based refractive index (RI) sensors, which could be good candidates for optical biochemical sensing. Several fibre grating based RI sensors have been proposed and demonstrated by exploring novel grating structures and different fibre types, and by employing an efficient hydrofluoric acid etching technique to enhance the RI sensitivity. All the RI devices discussed in this thesis have been used to measure the concentration of sugar solutions to simulate chemical sensing. Efforts have also been made to overcome the RI-temperature cross-sensitivity for practical application. The demonstrated in-fibre grating based RI sensors could be further developed into potential optical biosensors by applying bioactive coatings to realise high bio-sensitivity and bio-selectivity. Another major contribution of this thesis is the application of TFBGs. A prototype interrogation system using a TFBG with a CCD array was implemented to perform wavelength division multiplexing (WDM) interrogation around the 800 nm wavelength region, with the advantages of compact size, fast detection speed and low cost. As a highlight, a novel in-fibre twist sensor utilising the strong polarisation-dependent coupling behaviour of an 81°-TFBG was presented, demonstrating high torsion sensitivity and the capability of direction recognition.
Abstract:
Formative measurement has seen increasing acceptance in organizational research since the turn of the 21st century. In more recent times, however, a number of criticisms of the formative approach have appeared. Such work argues that formatively measured constructs are empirically ambiguous and thus flawed in a theory-testing context. The aim of the present paper is to examine the underpinnings of formative measurement theory in light of theories of causality and of ontology in measurement in general. In doing so, a thesis is advanced which draws a distinction between reflective, formative, and causal theories of latent variables. This distinction is shown to be advantageous in that it clarifies the ontological status of each type of latent variable and thus provides advice on appropriate conceptualization and application. The distinction also reconciles, in part, both recent supportive and critical perspectives on formative measurement. In light of this, advice is given on how most appropriately to model formative composites in theory-testing applications, placing the onus on the researcher to make their conceptualization and operationalisation clear.
Abstract:
In this paper, we discuss some practical implications of implementing adaptable network algorithms for non-stationary time series problems. Two real-world data sets, containing electricity load demands and foreign exchange market prices, are used to test several different methods, ranging from linear models with fixed parameters to non-linear models which adapt both parameters and model order on-line. Training with the extended Kalman filter, we demonstrate that the dynamic model-order increment procedure of the resource-allocating RBF network (RAN) is highly sensitive to the parameters of the novelty criterion. We investigate the use of system noise for increasing the plasticity of the Kalman filter training algorithm, and discuss the consequences for on-line model order selection. The results of our experiments show that there are advantages to be gained in tracking real-world non-stationary data through the use of more complex adaptive models.
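The role of system noise in keeping a Kalman-trained model plastic can be sketched with a scalar example: tracking the single weight of a drifting linear model. The drift rate and variances below are illustrative assumptions, not the settings of the paper's RBF experiments:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
w_true = 1.0 + 0.002 * np.arange(n)          # non-stationary: weight drifts
u = rng.normal(size=n)                       # input series
d = w_true * u + 0.05 * rng.normal(size=n)   # observed target series

def track(q):
    """Kalman tracking of one weight; the system-noise variance q controls
    the plasticity of the estimator (q = 0 reduces to recursive least squares)."""
    w, P, r = 0.0, 1.0, 0.05 ** 2
    sq_err = []
    for t in range(n):
        P += q                               # predict: inflate uncertainty
        k = P * u[t] / (P * u[t] ** 2 + r)   # Kalman gain
        w += k * (d[t] - w * u[t])           # correct towards the new sample
        P *= 1.0 - k * u[t]
        sq_err.append((w - w_true[t]) ** 2)
    return float(np.mean(sq_err[200:]))      # steady-state tracking error

frozen = track(0.0)    # no system noise: the gain decays and the model freezes
plastic = track(1e-4)  # a little system noise keeps the gain alive and tracks
```

With `q = 0` the posterior variance collapses, the gain tends to zero and the estimate lags ever further behind the drift; a small `q` keeps the filter responsive, which is exactly the plasticity trade-off discussed in the paper.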
Abstract:
All aspects of the concept of collocation – the phenomenon whereby words naturally tend to occur in the company of a restricted set of other words – are covered in this book. It deals in detail with the history of the word collocation, the concepts associated with it and its use in a linguistic context. The authors show the practical means by which the collocational behaviour of words can be explored using illustrative computer programs, and examine applications in teaching, lexicography and natural language processing that use collocation information. The book investigates the place that collocation occupies in theories of language and provides a comprehensive and up-to-date survey of the current position of collocation in language studies and applied linguistics. The text covers both the theoretical and practical background and the implications and applications of the concept as a language model and analytical tool, and it provides a definitive survey of currently available techniques and a detailed description of their implementation.
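A small illustration of the kind of program the book describes: scoring adjacent word pairs by pointwise mutual information, a standard collocation statistic. The toy text is, of course, invented:

```python
import math
from collections import Counter

def collocations(tokens, min_count=2):
    """Score adjacent word pairs by pointwise mutual information:
        PMI(a, b) = log2( P(a, b) / (P(a) * P(b)) ).
    High-PMI pairs are words that keep each other's company more often
    than their individual frequencies would predict."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n = len(tokens)
    scores = {}
    for (a, b), c in bigrams.items():
        if c >= min_count:                   # ignore pairs seen only once
            p_ab = c / (n - 1)
            p_a, p_b = unigrams[a] / n, unigrams[b] / n
            scores[a, b] = math.log2(p_ab / (p_a * p_b))
    return sorted(scores.items(), key=lambda kv: -kv[1])

toy = ("strong tea is strong and strong coffee is not weak "
       "strong tea and strong tea again").split()
ranked = collocations(toy)                   # ('strong', 'tea') scores highly
```

On a real corpus the same few lines surface familiar collocations ("strong tea" rather than "powerful tea"), which is the behaviour the book analyses in depth.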