960 results for new methods


Relevance:

60.00%

Publisher:

Abstract:

The civil engineering industry generally regards new methods and technology with a high degree of scepticism, preferring traditional and trusted methods. During the 1980s, competition for civil engineering consultancy work worldwide became fierce. Halcrow recognised the need to maintain and improve its competitive edge over other consultants, and the use of new technology in the form of microcomputers was seen as one way to maintain and improve its reputation in the world. This thesis examines the role of microcomputers in civil engineering consultancy, with particular reference to overseas projects. The involvement of civil engineers with computers, both past and present, has been investigated, and a survey of the use of microcomputers by consultancies was carried out; the results are presented and analysed. A review of the state of the art in microcomputer technology was made. Various case studies were carried out to examine the feasibility of using microcomputers on overseas projects. One case study examined two projects in Bangladesh and is used to illustrate the requirements and problems encountered in such situations. Two programming applications were undertaken: a dynamic programming model of a single-site reservoir and a simulation of the Bangladesh gas grid system. A cost-benefit analysis of a water resources project using microcomputers in the Aguan Valley, Honduras, was carried out. Although the initial cost of microcomputers is often small, the overall costs can prove very high and are likely to exceed those of traditional computer methods. A planned approach to the use of microcomputers is essential in order to reap the expected benefits, and recommendations for the implementation of such an approach are presented.
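The single-site reservoir model mentioned above lends itself to a compact sketch. The following is a hypothetical minimal dynamic programming formulation (the storage grid, inflows and square-root benefit function are invented for illustration), not the thesis model:

```python
# Hypothetical sketch of a dynamic programming model for single-site reservoir
# operation; the discretised storages, releases, inflows and benefit function
# are invented for illustration. Backward recursion over monthly stages with
# the water balance s_next = s + inflow - release.
def reservoir_dp(inflows, storages, releases, benefit):
    value = {s: 0.0 for s in storages}  # terminal value at the horizon
    policy = []
    for t in reversed(range(len(inflows))):
        new_value, best = {}, {}
        for s in storages:
            candidates = []
            for r in releases:
                s_next = s + inflows[t] - r
                if s_next in value:  # transition stays on the storage grid
                    candidates.append((benefit(r) + value[s_next], r))
            new_value[s], best[s] = max(candidates)
        policy.insert(0, best)
        value = new_value
    return value, policy

# Toy run: three months, storage and release levels 0..3, diminishing returns.
inflows = [2, 1, 2]
grid = range(4)
value, policy = reservoir_dp(inflows, grid, grid, lambda r: r ** 0.5)
print(round(value[2], 2))  # 4.56
```

The backward recursion returns both the value of each starting storage and a per-month release policy; a real model would add evaporation, spill and operational constraints.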


Firstly, we numerically model a practical 20 Gb/s undersea configuration employing the Return-to-Zero Differential Phase Shift Keying data format. The modelling is completed using the Split-Step Fourier Method to solve the Generalised Nonlinear Schrödinger Equation. We optimise the dispersion map and per-channel launch power of these channels and investigate how the choice of pre/post compensation can influence the performance. After obtaining these optimal configurations, we investigate the Bit Error Rate estimation of these systems, and we see that estimation based on Gaussian electrical current statistics is appropriate for systems of this type, indicating quasi-linear behaviour. The introduction of narrower pulses due to the deployment of quasi-linear transmission decreases the tolerance to chromatic dispersion and intra-channel nonlinearity. We used tools from mathematical statistics to study the behaviour of these channels in order to develop new methods to estimate Bit Error Rate. In the final section, we consider the estimation of Eye Closure Penalty, a popular measure of signal distortion. Using a numerical example and assuming the symmetry of eye closure, we see that we can simply estimate Eye Closure Penalty using Gaussian statistics. We also see that the statistics of the logical ones dominate the statistics of signal distortion in the case of Return-to-Zero On-Off Keying configurations.
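As a rough illustration of the modelling technique named above, the following is a minimal split-step Fourier sketch for a scalar nonlinear Schrödinger equation (the step sizes and fibre parameters are invented, and the thesis's dispersion map, DPSK format and amplification are omitted):

```python
import numpy as np

# Minimal split-step Fourier sketch for the scalar NLSE
#   dA/dz = -i*(beta2/2)*d^2A/dt^2 + i*gamma*|A|^2*A
# with illustrative, invented parameter values.
def ssfm(A0, dt, dz, n_steps, beta2=-21e-27, gamma=1.3e-3):
    n = len(A0)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=dt)      # angular frequency grid
    lin = np.exp(1j * (beta2 / 2) * omega**2 * dz)   # dispersion operator per step
    A = A0.astype(complex)
    for _ in range(n_steps):
        A = np.fft.ifft(lin * np.fft.fft(A))             # dispersion, frequency domain
        A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)   # Kerr nonlinearity, time domain
    return A

# Gaussian pulse propagated over a short span.
t = np.linspace(-50e-12, 50e-12, 1024)
A0 = np.exp(-(t / 10e-12) ** 2)
A = ssfm(A0, dt=t[1] - t[0], dz=100.0, n_steps=50)
```

Both the dispersion and Kerr steps are pure phase rotations, so the pulse energy is conserved, which makes a quick sanity check for the implementation.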


A real-time three-dimensional (3D) object sensing and reconstruction scheme is presented that can be applied to any arbitrary corporeal shape. Operation is demonstrated on several calibrated objects. The system uses curvature sensors based upon in-line fiber Bragg gratings encapsulated in a low-temperature-curing synthetic silicone. New methods to quantitatively evaluate the performance of a 3D object-sensing scheme are developed and appraised. It is shown that the sensing scheme yields a volumetric error of 1% to 9%, depending on the object.
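The paper's evaluation methods are not reproduced here, but a volumetric error of the kind quoted can be illustrated with a toy occupancy-grid comparison (the voxel representation and the reconstruction below are invented):

```python
import numpy as np

# Hypothetical sketch: percentage volumetric error between a calibrated
# reference object and its reconstruction, both voxelised as boolean
# occupancy grids. This is not the paper's actual metric.
def volumetric_error(reference, reconstructed):
    v_ref = reference.sum()            # occupied voxels in the reference
    v_rec = reconstructed.sum()        # occupied voxels in the reconstruction
    return abs(v_rec - v_ref) / v_ref * 100.0

# Toy example: a 20x20x20 cube reconstructed as a 19x20x20 block -> 5% error.
ref = np.zeros((32, 32, 32), dtype=bool); ref[:20, :20, :20] = True
rec = np.zeros((32, 32, 32), dtype=bool); rec[:19, :20, :20] = True
print(round(volumetric_error(ref, rec), 1))  # 5.0
```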


Recent advances in our ability to watch the molecular and cellular processes of life in action, such as atomic force microscopy, optical tweezers and Förster fluorescence resonance energy transfer, raise challenges for digital signal processing (DSP) of the resulting experimental data. This article explores the unique properties of such biophysical time series that set them apart from other signals, such as the prevalence of abrupt jumps and steps, multi-modal distributions and autocorrelated noise. It exposes the problems with classical linear DSP algorithms applied to this kind of data, and describes new nonlinear and non-Gaussian algorithms that are able to extract information of direct relevance to biological physicists. It is argued that these new methods, applied in this context, typify the nascent field of biophysical DSP. Practical experimental examples are supplied.
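A minimal sketch of the linear-versus-nonlinear contrast described above, assuming a synthetic stepped signal (the window length and noise level are invented):

```python
import numpy as np

# Illustrative sketch (not from the article): a noisy piecewise-constant
# signal with one abrupt step, like the jumps seen in single-molecule data.
# A linear moving-average filter smears the step into a ramp, while a
# nonlinear median filter over the same window preserves the sharp edge.
rng = np.random.default_rng(0)
signal = np.concatenate([np.zeros(100), np.ones(100)])   # one abrupt step at index 100
noisy = signal + 0.1 * rng.standard_normal(200)

k = 21  # odd window length
pad = np.pad(noisy, k // 2, mode="edge")
windows = np.lib.stride_tricks.sliding_window_view(pad, k)
linear = windows.mean(axis=1)        # classical linear smoother
median = np.median(windows, axis=1)  # nonlinear, step-preserving filter

# Across the true step location, the linear output rises by ~1/21 per sample,
# while the median output jumps by almost the full step height.
print(linear[100] - linear[99])
print(median[100] - median[99])
```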


Java software or libraries can evolve via subclassing. Unfortunately, subclassing may not properly support code adaptation when there are dependencies between classes. More precisely, subclassing in collections of related classes may require reimplementation of otherwise valid classes. This problem is known as the subclassing anomaly, and it becomes an issue whenever software evolution or code reuse is a goal of a programmer who is using existing classes. Object Teams offers an implicit fix to this problem and is largely compatible with existing JVMs. In this paper, we evaluate how well Object Teams succeeds in providing a solution for a complex, real-world project. Our results indicate that while Object Teams is a suitable solution for simple examples, it does not meet the requirements of large-scale projects. The reasons why Object Teams fails in certain usages may prove useful to those who create linguistic modifications to languages or who seek new methods for code adaptation.
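The subclassing anomaly can be illustrated outside Java; the following hypothetical Python sketch shows the shape of the problem (the class names are invented, and this does not reproduce Object Teams' mechanism):

```python
# Hypothetical sketch of the subclassing anomaly: two collaborating classes,
# where extending one forces re-derivation of the other because the
# collaborator's type is baked into the original class.

class Node:
    def label(self):
        return "node"

class Graph:
    node_cls = Node          # collaborator type fixed at definition time

    def make_node(self):
        return self.node_cls()

# Extending Node alone is not enough: an unmodified Graph still creates plain
# Nodes, so the otherwise-valid Graph must also be subclassed just to rewire
# the dependency. Every new Node variant demands a parallel Graph variant.
class ColoredNode(Node):
    def label(self):
        return "colored node"

class ColoredGraph(Graph):
    node_cls = ColoredNode

print(Graph().make_node().label())         # node
print(ColoredGraph().make_node().label())  # colored node
```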


This paper presents an innovative approach to enhancing digital library functionalities. An innovative distributed architecture involving digital libraries for effective and efficient knowledge sharing was developed. Within this architecture, semantic services were implemented, offering multi-language and multi-culture support, adaptability and knowledge-resource recommendation, based on the use of ontologies, metadata and user modeling. New methods for teacher education using digital libraries and knowledge sharing were developed. These new methods were successfully applied in more than 15 pilot experiments in seven European countries, with more than 3000 teachers trained.


This chapter contributes to the anthology on learning to research - researching to learn because it emphasises a need to design curricula that enable living research and ongoing researcher development, rather than curricula that restrict student and staff activities within a marketised approach towards time. In recent decades higher education (HE) has come to be valued for its contribution to the global economy. Referred to as the neo-liberal university, a strong prioritisation has been placed on meeting the needs of industry by providing a better workforce. This perspective emphasises the role of a degree in HE to secure future material affluence, rather than to study as an on-going investment in the self (Molesworth, Nixon & Scullion, 2009: 280). Students are treated primarily as consumers in this model, where through their tuition fees they purchase a product, rather than benefit from the transformative potential university education offers for the whole of life. Given that HE is now measured by the numbers of students it attracts, and later places into well-paid jobs, there is an intense pressure on time, which has led to a method whereby the learning experiences of students are broken down into discrete modules. Whilst this provides consistency, students can come to view research processes in a fragmented way within the modular system. Topics are presented chronologically, week by week, and students simply complete a set of tasks to ‘have a degree’, rather than to ‘be learners’ (Molesworth, Nixon & Scullion, 2009: 277) who are living their research in relation to their own past, present and future. The idea of living research in this context is my own adaptation of an approach suggested by C. Wright Mills (1959) in The Sociological Imagination. Mills advises that successful scholars do not split their work from the rest of their lives, but treat scholarship as a choice of how to live, as well as a choice of career.
The marketised slant in HE thus creates a tension, firstly, for students who are learning to research. Mills would encourage them to be creative, not instrumental, in their use of time, yet they are journeying through a system that is structured for swift progression towards a highly paid job, rather than crafted for reflexive inquiry that transforms their understanding throughout life. Many universities are placing a strong focus on discrete skills for student employability, but I suggest that embedding the transformative skills emphasised by Mills empowers students and builds their confidence, helping them make the connections that aid their employability. Secondly, the marketised approach creates a problem for staff designing the curriculum if students do not easily make links across time, over their years of study and whole programmes. By researching to learn, staff can discover new methods to apply in their design of the curriculum, to help students make important and creative connections across their programmes of study.


Microposts are small fragments of social media content published using a lightweight paradigm (e.g. Tweets, Facebook likes, Foursquare check-ins). Microposts have been used for a variety of applications (e.g. sentiment analysis, opinion mining, trend analysis) by gleaning useful information, often using third-party concept extraction tools. There has been a very large uptake of such tools in the last few years, along with the creation and adoption of new methods for concept extraction. However, the evaluation of such efforts has been largely confined to document corpora (e.g. news articles), calling into question the suitability of concept extraction tools and methods for Micropost data. This report describes the Making Sense of Microposts Workshop (#MSM2013) Concept Extraction Challenge, hosted in conjunction with the 2013 World Wide Web conference (WWW'13). The Challenge dataset comprised a manually annotated training corpus of Microposts and an unlabelled test corpus. Participants were set the task of engineering a concept extraction system for a defined set of concepts. Out of a total of 22 complete submissions, 13 were accepted for presentation at the workshop; the submissions covered methods ranging from sequence mining algorithms for attribute extraction to part-of-speech tagging for Micropost cleaning and rule-based and discriminative models for token classification. In this report we describe the evaluation process and explain the performance of different approaches in different contexts.
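None of the challenge submissions are reproduced here, but a toy rule-based extractor of the kind the report mentions might look like this (the cleaning rules and the capitalisation heuristic are invented for illustration):

```python
import re

# Toy illustration (not a challenge system): a rule-based concept extractor
# that first strips Twitter-style artefacts from a Micropost, then tags
# consecutive capitalised words as one candidate concept.
def extract_concepts(micropost):
    text = re.sub(r"https?://\S+", "", micropost)   # drop URLs
    text = re.sub(r"[@#](\w+)", r"\1", text)        # unwrap mentions/hashtags
    return re.findall(r"\b(?:[A-Z][a-z]+)(?:\s[A-Z][a-z]+)*\b", text)

print(extract_concepts("watching Manchester United at Old Trafford tonight http://t.co/abc"))
```

Real submissions went far beyond such heuristics, using sequence mining, part-of-speech tagging and discriminative token classifiers, as the report notes.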


Over the past 50 years there has been considerable progress in our understanding of biomolecular interactions at the atomic level. This has in turn allowed molecular simulation methods employing full atomistic modeling to develop at ever larger scales. However, some challenging areas remain where there is either a lack of atomic-resolution structures or the simulation system is inherently complex. An area presenting both challenges is that of membranes containing membrane proteins. In this review we analyse a new practical approach to membrane protein study that offers a potential new route to high-resolution structures and the possibility of simplifying simulations. These new approaches collectively recognise that preserving the interaction between the membrane protein and the lipid bilayer is often essential to maintain structure and function. The new methods preserve these interactions by producing nano-scale disc-shaped particles that include the bilayer and the chosen protein. Currently, two approaches lead in this area: the MSP system, which relies on peptides to stabilise the discs, and SMALPs, where an amphipathic styrene maleic acid copolymer is used. Both methods greatly enable protein production and hence have the potential to accelerate atomic-resolution structure determination, as well as providing a simplified format for simulations of membrane protein dynamics.


Temporal dynamics of Raman fibre lasers tend to be very complex, owing to great cavity lengths and high nonlinearity: stochastic on short time scales and quasi-continuous on longer time scales. Fibre laser intensity dynamics is generally represented as a one-dimensional time series, which in the case of quasi-continuous-wave generation in Raman fibre lasers gives little insight into the processes underlying the operation of the laser. New methods of analysis and data representation could help to uncover the underlying physical processes, understand the dynamics, or improve the performance of the system. Using the intrinsic periodicity of the laser radiation, a one-dimensional intensity time series of a Raman fibre laser was analysed over fast and slow variation times. This made it possible to observe experimentally various spatio-temporal regimes of generation, such as laminar, turbulent and partial mode-locking, as well as transitions between them, and to identify the mechanisms responsible for the transitions. Great cavity length and high nonlinearity also make it difficult to achieve stable high-repetition-rate mode-locking in Raman fibre lasers. Using Faraday parametric instability in an extremely simple linear-cavity experimental configuration, very high order harmonic mode-locking was achieved in a ò.ò km long Raman fibre laser. The maximum achieved pulse repetition rate was 12 GHz, with 7.3 ps long Gaussian-shaped pulses. There is also a new type of random laser, the random distributed feedback Raman fibre laser, whose temporal properties cannot be controlled by conventional mode-locking or Q-switching techniques and mechanisms. By adjusting the pump configuration, very stable pulsed operation of a random distributed feedback Raman fibre laser was achieved. Pulse duration varied in the range from 50 to 200 μs depending on the pump power and the cavity length. The scaling of the pulse repetition rate with the parameters of the system was identified experimentally.
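The fast/slow-time analysis described above can be sketched compactly: fold the one-dimensional intensity series at the cavity round-trip period, so that each row is one round trip (fast time) and the row index is the slow evolution time. The following is a hypothetical illustration with invented parameters:

```python
import numpy as np

# Illustrative sketch (parameters invented): reshaping a 1D laser intensity
# time series into a 2D spatio-temporal representation using the cavity
# round-trip time as the intrinsic period.
def spatio_temporal(intensity, samples_per_round_trip):
    n_round_trips = len(intensity) // samples_per_round_trip
    trimmed = intensity[: n_round_trips * samples_per_round_trip]
    return trimmed.reshape(n_round_trips, samples_per_round_trip)

# A structure recurring once per round trip shows up as a vertical (or, if it
# drifts, tilted) stripe in the 2D picture, invisible in the raw 1D series.
period = 500
t = np.arange(period * 200)
trace = np.exp(-(((t % period) - period / 2) ** 2) / 50.0)
st = spatio_temporal(trace, period)
print(st.shape)  # (200, 500)
```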


The main challenges of multimedia data retrieval lie in the effective mapping between low-level features and high-level concepts, and in the individual users' subjective perceptions of multimedia content.

The objectives of this dissertation are to develop an integrated multimedia indexing and retrieval framework with the aim to bridge the gap between semantic concepts and low-level features. To achieve this goal, a set of core techniques have been developed, including image segmentation, content-based image retrieval, object tracking, video indexing, and video event detection. These core techniques are integrated in a systematic way to enable the semantic search for images/videos, and can be tailored to solve the problems in other multimedia related domains. In image retrieval, two new methods of bridging the semantic gap are proposed: (1) for general content-based image retrieval, a stochastic mechanism is utilized to enable the long-term learning of high-level concepts from a set of training data, such as user access frequencies and access patterns of images. (2) In addition to whole-image retrieval, a novel multiple instance learning framework is proposed for object-based image retrieval, by which a user is allowed to more effectively search for images that contain multiple objects of interest. An enhanced image segmentation algorithm is developed to extract the object information from images. This segmentation algorithm is further used in video indexing and retrieval, by which a robust video shot/scene segmentation method is developed based on low-level visual feature comparison, object tracking, and audio analysis. Based on shot boundaries, a novel data mining framework is further proposed to detect events in soccer videos, while fully utilizing the multi-modality features and object information obtained through video shot/scene detection.

Another contribution of this dissertation is the potential of the above techniques to be tailored and applied to other multimedia applications. This is demonstrated by their utilization in traffic video surveillance applications. The enhanced image segmentation algorithm, coupled with an adaptive background learning algorithm, improves the performance of vehicle identification. A sophisticated object tracking algorithm is proposed to track individual vehicles, while the spatial and temporal relationships of vehicle objects are modeled by an abstract semantic model.
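The dissertation's shot/scene segmentation combines visual features, object tracking and audio analysis; as a hedged illustration of just the low-level visual-feature comparison, a histogram-difference cut detector might look like this (the threshold, bin count and synthetic frames are invented):

```python
import numpy as np

# Hypothetical sketch (not the dissertation's algorithm): detect shot
# boundaries where the grey-level histogram difference between consecutive
# frames exceeds a threshold.
def shot_boundaries(frames, bins=16, threshold=0.5):
    hists = []
    for frame in frames:
        h, _ = np.histogram(frame, bins=bins, range=(0, 256))
        hists.append(h / h.sum())                        # normalised histogram
    cuts = []
    for i in range(1, len(hists)):
        d = 0.5 * np.abs(hists[i] - hists[i - 1]).sum()  # total variation distance
        if d > threshold:
            cuts.append(i)
    return cuts

# Two synthetic "shots": five dark frames followed by five bright frames.
dark = [np.full((32, 32), 40, dtype=np.uint8) for _ in range(5)]
bright = [np.full((32, 32), 200, dtype=np.uint8) for _ in range(5)]
print(shot_boundaries(dark + bright))  # [5]
```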


Beginning in the era of the Spanish conquest and taking the reader right up to the present day, this book focuses on how the landscape of Cuba has changed and evolved into the environment we see today. It illustrates the range of factors – economic, political and cultural – that have determined Cuba’s physical geography, and explores the shifting conservation measures which have been instituted in response to new methods in agriculture and land management. The text uses historical documents, fieldwork, Geographic Information System (GIS) data and remotely-sensed satellite imagery to detail Cuba’s extensive land-use history as well as its potential future. The author goes further to analyze the manner, speed and methods of landscape change, and examines the historical context and governing agendas that have had an impact on the relationship between Cuba’s inhabitants and their island. Gebelein also assesses the key role played by agricultural production in the framework of international trade required to sustain Cuba’s people and its economy. The book concludes with a review of current efforts by Cuban and other research scientists, as well as private investors, conservation managers and university professors who are involved in shaping Cuba’s evolving landscape and managing it during the country’s possible transition to a more politically diverse, enfranchised and open polity.


The primary goal of this dissertation is the study of patterns of viral evolution inferred from serially-sampled sequence data, i.e., sequence data obtained from strains isolated at consecutive time points from a single patient or host. RNA viral populations have extremely high genetic variability, largely due to their astronomical population sizes within host systems, high replication rates, and short generation times. It is this aspect of their evolution that demands special attention and a different approach when studying the evolutionary relationships of serially-sampled sequence data. New methods for analysing serially-sampled data were developed shortly after a groundbreaking HIV-1 study of several patients from whom viruses were isolated at recurring intervals over a period of 10 or more years. These methods assume a tree-like evolutionary model, whereas many RNA viruses have the capacity to exchange genetic material with one another through a process called recombination.

A genealogy involving recombination is best described by a network structure. A more general approach was implemented in a new computational tool, Sliding MinPD, one that is mindful of the sampling times of the input sequences and that reconstructs the viral evolutionary relationships in the form of a network structure with implicit representations of recombination events. The underlying network organization reveals unique patterns of viral evolution and could help explain the emergence of disease-associated mutants and drug-resistant strains, with implications for patient prognosis and treatment strategies. In order to comprehensively test the developed methods and to carry out comparison studies with other methods, synthetic data sets are critical. Therefore, appropriate sequence generators were also developed to simulate the evolution of serially-sampled recombinant viruses, new and more thorough evaluation criteria for recombination detection methods were established, and three major comparison studies were performed. The newly developed tools were also applied to "real" HIV-1 sequence data, and it was shown that the results, represented within an evolutionary network structure, can be interpreted in biologically meaningful ways.
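Sliding MinPD itself is not reproduced here, but the minimum-pairwise-distance idea for serially-sampled sequences can be sketched as follows (the distance measure, toy sequences and linkage rule are invented, and recombination detection is omitted):

```python
# Illustrative sketch of the minimum-pairwise-distance idea behind tools like
# Sliding MinPD (details invented): each sequence sampled at time t is linked
# to its closest candidate ancestor among sequences from earlier time points,
# using Hamming distance as a stand-in for an evolutionary distance.
def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def link_ancestors(samples):
    """samples: list of (time, name, sequence); sequences aligned, equal length."""
    links = {}
    for t, name, seq in samples:
        earlier = [(u, n, s) for u, n, s in samples if u < t]
        if earlier:
            _, ancestor, _ = min(earlier, key=lambda e: hamming(seq, e[2]))
            links[name] = ancestor
    return links

samples = [
    (1, "A", "ACGTACGT"),
    (1, "B", "ACGTTCGT"),
    (2, "C", "ACGTTCGA"),  # one change away from B, two away from A
]
print(link_ancestors(samples))  # {'C': 'B'}
```

A network-building tool would additionally allow a sequence to have multiple parents where a recombination breakpoint is detected, which is what distinguishes the network representation from a tree.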


What is the architecture of transience? What role does architecture play in the impermanent context of the nomad? What form does architecture take when our perception of shelter transforms from fixed and static to flexible and transportable? How does architecture react to the challenges of mobility and change? Traditional building forms speak of stability as an important aspect of architecture. Does portability imply a different building form? During the 1950s, Buckminster Fuller introduced the idea of mobile, portable structures. In the 1960s, Archigram's examples of architectural nomadism made the mobile home an accepted feature of our contemporary landscape. Currently, new materials and new methods of assembly and transportation open opportunities for rethinking portable architecture. For this thesis, a shelter was developed that provides inhabitable space and portability. The shelter was designed to be easily carried as a backpack. With minimal human effort, the structure is assembled and erected in a few minutes. Although this portable shelter needs to be maneuvered, folded and tucked away for transportation, it does meet the demands of nomadic behavior, which emphasizes comfort and portability.


The general method for determining organomercurials in environmental and biological samples is gas chromatography with electron capture detection (GC-ECD). However, tedious sample work-up protocols and poor chromatographic response show the need for the development of new methods. Here, atomic fluorescence-based methods free from these deficiencies are described. The organomercurials in soil, sediment and tissue samples are first released from the matrices with acidic KBr and cupric ions and extracted into dichloromethane. The initial extracts are subjected to thiosulfate clean-up, and the organomercury species are isolated as their chloride derivatives by cupric chloride and subsequent extraction into a small volume of dichloromethane. In water samples, the organomercurials are pre-concentrated using a sulfhydryl cotton fiber adsorbent, followed by elution with acidic KBr and CuSO4 and extraction into dichloromethane. Analysis of the organomercurials is accomplished by capillary column chromatography with atomic fluorescence detection.