890 results for Urban ecology: patterns, processes and applications


Relevance:

100.00%

Publisher:

Abstract:

Background. We describe the development, reliability and applications of the Diagnostic Interview for Psychoses (DIP), a comprehensive interview schedule for psychotic disorders. Method. The DIP is intended for use by interviewers with a clinical background and was designed to occupy the middle ground between fully structured, lay-administered schedules and semi-structured, psychiatrist-administered interviews. It encompasses four main domains: (a) demographic data; (b) social functioning and disability; (c) a diagnostic module comprising symptoms, signs and past history ratings; and (d) patterns of service utilization and patient-perceived need for services. It generates diagnoses according to several sets of criteria using the OPCRIT computerized diagnostic algorithm and can be administered either on-screen or in a hard-copy format. Results. The DIP proved easy to use and was well accepted in the field. For the diagnostic module, inter-rater reliability was assessed on 20 cases rated by 24 clinicians: good reliability was demonstrated for both ICD-10 and DSM-III-R diagnoses. Seven cases were interviewed 2-11 weeks apart to determine test-retest reliability, with pairwise agreement of 0.8-1.0 for most items. Diagnostic validity was assessed in 10 cases interviewed with the DIP, using the SCAN as the 'gold standard': in nine cases the clinical diagnoses were in agreement. Conclusions. The DIP is suitable for use in large-scale epidemiological studies of psychotic disorders, as well as in smaller studies where time is at a premium. While the diagnostic module stands on its own, the full DIP schedule, covering demography, social functioning and service utilization, makes it a versatile multi-purpose tool.

Relevance:

100.00%

Publisher:

Abstract:

We contend that powerful group studies can be conducted using magnetoencephalography (MEG), which can provide useful insights into the approximate distribution of the neural activity detected with MEG without requiring magnetic resonance imaging (MRI) for each participant. Instead, a participant's MRI is approximated with one chosen as a best match on the basis of the scalp surface from a database of available MRIs. Because large inter-individual variability in sulcal and gyral patterns is an inherent source of blurring in studies using grouped functional activity, the additional error introduced by this approximation procedure has little effect on the group results, and the substituted MRI offers a sufficiently close approximation to the participant's own to yield a good indication of the true distribution of the grouped neural activity. T1-weighted MRIs of 28 adults were acquired in a variety of MR systems. An artificial functional image was prepared for each person in which eight 5 × 5 × 5 mm regions of brain activation were simulated. Spatial normalisation was applied to each image using transformations calculated using SPM99 with (1) the participant's actual MRI, and (2) the best-matched MRI substituted from those of the other 27 participants. The distribution of distances between the locations of points using real and substituted MRIs had a modal value of 6 mm, with 90% of cases falling below 12.5 mm. The effects of this approach on real grouped SAM source imaging of MEG data in a verbal fluency task are also shown. The distribution of MEG activity in the estimated average response is very similar to that produced when using the real MRIs. © 2003 Wiley-Liss, Inc.

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents detailed research work on diamond materials. Chapter 1 is an overall introduction to the thesis. In Chapter 2, the literature on the physical, chemical, optical, mechanical and other properties of diamond materials is reviewed. Following this chapter, several advanced diamond growth and characterisation techniques used in the experimental work are introduced. The successful installation and application of a chemical vapour deposition system is then demonstrated in Chapter 4. Diamond growth on a variety of substrates has been investigated, such as silicon, diamond-like carbon and silica fibres. In Chapter 5, a single crystalline diamond substrate was used to perform femtosecond laser inscription. The results proved the potential feasibility of this technique, which could be utilised in fabricating future biochemistry microfluidic channels on diamond substrates. In Chapter 6, hydrogen-terminated nanodiamond powder was studied using impedance spectroscopy; its intrinsic electrical properties and thermal stability are presented and analysed in detail. As the first PhD student within the Nanoscience Research Group at Aston, my initial research work focused on the installation and testing of the microwave plasma enhanced chemical vapour deposition (MPECVD) system, which will benefit all future researchers in the group. The fundamentals of the MPECVD system are introduced in detail. After optimisation of the growth parameters, uniform diamond deposition was achieved with good surface coverage and uniformity. Furthermore, one of the most significant contributions of this work is the successful inscription of patterns on diamond substrates with a femtosecond laser system. Previous research on femtosecond laser inscription on diamond produced only simple lines or dots, with few characterisation techniques applied.
In my research work, the femtosecond laser has been successfully used to inscribe patterns on diamond substrates, and full characterisation by SEM, Raman, XPS and AFM has been carried out. After the femtosecond laser inscription, the depth of the microfluidic channels on the diamond film was found to be 300-400 nm, with a graphitic layer thickness of 165-190 nm. Another important outcome of this work is the first characterisation of the electrical properties of hydrogen-terminated nanodiamond with impedance spectroscopy. Based on the experimental evaluation and mathematical fitting, the resistance of hydrogen-terminated nanodiamond was reduced to 0.25 MΩ, four orders of magnitude lower than that of untreated nanodiamond. In addition, a theoretical equivalent circuit has been proposed to fit the results. Furthermore, the hydrogen-terminated nanodiamond samples were annealed at different temperatures to study their thermal stability. The XPS and FTIR results indicate that hydrogen-terminated nanodiamond starts to oxidise above 100 °C and that the C-H bonds can survive up to 400 °C. This research work reports the fundamental electrical properties of hydrogen-terminated nanodiamond, which can be used in future physical or chemical applications.

Relevance:

100.00%

Publisher:

Abstract:

2010 Mathematics Subject Classification: Primary 60J80; Secondary 92D30.

Relevance:

100.00%

Publisher:

Abstract:

Clusters are aggregations of atoms or molecules, generally intermediate in size between individual atoms and aggregates that are large enough to be called bulk matter. Clusters can also be called nanoparticles, because their size is on the order of nanometers or tens of nanometers. A new field called nanostructured materials has begun to take shape, which takes advantage of these atom clusters. The ultra-small size of the building blocks leads to dramatically different properties, and it is anticipated that such atomically engineered materials will be able to be tailored to perform as no previous material could.

The idea of the ionized cluster beam (ICB) thin film deposition technique was first proposed by Takagi in 1972. It was based upon using a supersonic jet source to produce, ionize and accelerate beams of atomic clusters onto substrates in a vacuum environment. Conditions for formation of cluster beams suitable for thin film deposition have only recently been established, following twenty years of effort. Zinc clusters over 1,000 atoms in average size have been synthesized both in our lab and in that of Gspann. More recently, other methods of synthesizing clusters and nanoparticles, using different types of cluster sources, have come under development.

In this work, we studied different aspects of nanoparticle beams. The work includes refinement of a model of the cluster formation mechanism, development of a new real-time, in situ cluster size measurement method, and study of the use of ICB in the fabrication of semiconductor devices.

The formation process of the vaporized-metal cluster beam was simulated and investigated using classical nucleation theory and one-dimensional gas flow equations. Zinc cluster sizes predicted at the nozzle exit are in good quantitative agreement with experimental results in our laboratory.

A novel in situ real-time mass, energy and velocity measurement apparatus has been designed, built and tested. This small time-of-flight mass spectrometer is suitable for use in our cluster deposition systems and does not suffer from problems associated with other methods of cluster size measurement, such as the requirement for specialized ionizing lasers, inductive electrical or electromagnetic coupling, dependency on the assumption of homogeneous nucleation, limits on the measurable size range, and the lack of real-time capability. Measured ion energies using the electrostatic energy analyzer are in good accordance with values obtained from computer simulation. The velocity (v) is measured by pulsing the cluster beam and measuring the time of delay between the pulse and the analyzer output current. The mass of a particle is calculated from m = 2E/v². The error in the measured value of the background gas mass is on the order of 28% of the mass of one N₂ molecule, which is negligible for the measurement of large clusters. This resolution in cluster size measurement is very acceptable for our purposes.

Selective area deposition onto conducting patterns overlying insulating substrates was demonstrated using intense, fully-ionized cluster beams. Parameters influencing the selectivity are ion energy, repelling voltage, the ratio of the conductor to insulator dimension, and substrate thickness.
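The mass calculation described above (m = 2E/v², with v taken from the pulse-to-analyzer delay) can be sketched in a few lines. The flight distance, ion energy and delay below are illustrative assumptions, not measurements from the apparatus:

```python
# Sketch of the time-of-flight mass calculation: m = 2E / v^2, with v obtained
# from the delay between the beam pulse and the analyzer output current.
# All numeric values here are illustrative assumptions.
E_CHARGE = 1.602176634e-19              # C, elementary charge (eV -> J)
ZN_ATOM_KG = 65.38 * 1.66053906660e-27  # mass of one zinc atom in kg

def cluster_mass(energy_ev: float, flight_m: float, delay_s: float) -> float:
    """Cluster mass in kg from kinetic energy (eV) and time-of-flight velocity."""
    energy_j = energy_ev * E_CHARGE      # kinetic energy in joules
    velocity = flight_m / delay_s        # v = flight path / measured delay
    return 2.0 * energy_j / velocity**2  # m = 2E / v^2

# Example: singly charged cluster at 1 keV, 0.5 m flight path, 100 us delay.
m = cluster_mass(energy_ev=1000.0, flight_m=0.5, delay_s=100e-6)
atoms = m / ZN_ATOM_KG  # approximate cluster size expressed in zinc atoms
```

For these assumed numbers the result corresponds to a cluster of roughly a hundred zinc atoms, the size regime discussed in the abstract.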

Relevance:

100.00%

Publisher:

Abstract:

Personalized recommender systems aim to assist users in retrieving and accessing interesting items by automatically acquiring user preferences from historical data and matching items with those preferences. In the last decade, recommendation services have gained great attention due to the problem of information overload. However, despite recent advances in personalization techniques, several critical issues in modern recommender systems have not been well studied. These issues include: (1) understanding the accessing patterns of users (i.e., how to effectively model users' accessing behaviors); (2) understanding the relations between users and other objects (i.e., how to comprehensively assess the complex correlations between users and entities in recommender systems); and (3) understanding the interest change of users (i.e., how to adaptively capture users' preference drift over time). To meet the needs of users in modern recommender systems, it is imperative to provide solutions to address the aforementioned issues and apply the solutions to real-world applications.

The major goal of this dissertation is to provide integrated recommendation approaches to tackle the challenges of the current generation of recommender systems. In particular, three user-oriented aspects of recommendation techniques were studied: understanding accessing patterns, understanding complex relations and understanding temporal dynamics. To this end, we made three research contributions. First, we presented various personalized user profiling algorithms to capture the click behaviors of users at both coarse and fine granularities; second, we proposed graph-based recommendation models to describe the complex correlations in a recommender system; third, we studied temporal recommendation approaches to capture the preference changes of users, considering both long-term and short-term user profiles. In addition, a versatile recommendation framework was proposed in which the proposed recommendation techniques were seamlessly integrated. Different evaluation criteria were implemented in this framework for evaluating recommendation techniques in real-world recommendation applications.

In summary, the frequent changes of user interests and item repositories lead to a series of user-centric challenges that are not well addressed in the current generation of recommender systems. My work proposed reasonable solutions to these challenges and provided insights on how to address them using a simple yet effective recommendation framework.

Relevance:

100.00%

Publisher:

Abstract:

A large series of laboratory ice crushing experiments was performed to investigate the effects of the external boundary condition and the indenter contact geometry on ice load magnitude under crushing conditions. Four boundary conditions were considered: dry cases, submerged cases, and cases with snow or granular ice material present on the indenter surface. The indenter geometries were a flat plate, a wedge-shaped indenter, a (reverse) conical indenter, and a spherical indenter. These were impacted with artificially produced conical ice specimens with 20° and 30° cone angles. All indenter-ice combinations were tested in dry and submerged environments at 1 mm/s and 100 mm/s indentation rates. Additional tests with the flat indentation plate were conducted at 10 mm/s impact velocity, and a subset of scenarios with snow and granular ice material was evaluated. The tests were performed using a material testing system (MTS) machine located inside a cold room at an ambient temperature of -7 °C. Data acquisition comprised time, vertical force, and displacement. In several tests with the flat plate and the wedge-shaped indenter, supplementary information on local pressure patterns and contact area was obtained using tactile pressure sensors. All tests were recorded with a high-speed video camera, and still photos were taken before and after each test. Thin sections were taken of some specimens as well. Ice loads were found to depend strongly on the contact condition, interrelated with pre-existing confinement and indentation rate. Submergence yielded higher forces, especially at the high indentation rate. This was very evident for the flat indentation plate and the spherical indenter, and with restrictions for the wedge-shaped indenter; no such effect was found for the conical indenter. For the conical indenter it was concluded that the structural restriction due to the indenter geometry was dominant: the working surface on which the water could act was not sufficient to influence the failure processes and associated ice loads. The presence of snow and granular ice significantly increased the forces at the low indentation rate (with the flat indentation plate), yielding loads higher than in the submerged cases and far above the dry contact condition. Contact area measurements revealed a correlation of the higher forces with a concurrent increase in actual contact area that depended on the respective boundary condition. Under submergence, the constitution of the ice debris was changed; ice extrusion, as well as crack development and propagation, was impeded. Snow and granular ice seemed to provide additional material sources for establishing larger contact areas. The dry contact condition generally had the smallest real contact area, as well as the lowest forces. The comparison of nominal and measured contact areas revealed distinct deviations. Incorporating those differences into process pressure-area relationships indicated that the overall process pressure was not substantially affected by the increased loads.
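The nominal-versus-measured contact-area comparison above amounts to dividing the same force by two different areas; a minimal sketch with illustrative values (not the experimental data):

```python
# Sketch of the nominal vs. measured contact-area comparison: process pressure
# is force divided by contact area, so the same load yields a lower pressure
# when the (larger) measured area is used. All values are illustrative.
def process_pressure_mpa(force_kn: float, area_cm2: float) -> float:
    """Pressure in MPa from force in kN and contact area in cm^2."""
    return (force_kn * 1e3) / (area_cm2 * 1e-4) / 1e6  # N / m^2 -> MPa

force_kn = 50.0      # hypothetical peak ice load
nominal_cm2 = 25.0   # area from indenter/ice geometry alone
measured_cm2 = 40.0  # larger actual area, e.g. from a tactile pressure sensor

p_nominal = process_pressure_mpa(force_kn, nominal_cm2)    # 20.0 MPa
p_measured = process_pressure_mpa(force_kn, measured_cm2)  # 12.5 MPa
```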

Relevance:

100.00%

Publisher:

Abstract:

While molecular and cellular processes are often modeled as stochastic processes, such as Brownian motion, chemical reaction networks and gene regulatory networks, there have been few attempts to program a molecular-scale process to physically implement stochastic processes. DNA has been used as a substrate for programming molecular interactions, but its applications have been restricted to deterministic functions, and unfavorable properties such as slow processing, thermal annealing, aqueous solvents and difficult readout limit them to proof-of-concept purposes. To date, whether there exists a molecular process that can be programmed to implement stochastic processes for practical applications remains unknown.

In this dissertation, a fully specified Resonance Energy Transfer (RET) network between chromophores is accurately fabricated via DNA self-assembly, and the exciton dynamics in the RET network physically implement a stochastic process, specifically a continuous-time Markov chain (CTMC), which has a direct mapping to the physical geometry of the chromophore network. Excited by a light source, a RET network generates random samples in the temporal domain in the form of fluorescence photons which can be detected by a photon detector. The intrinsic sampling distribution of a RET network is derived as a phase-type distribution configured by its CTMC model. The conclusion is that the exciton dynamics in a RET network implement a general and important class of stochastic processes that can be directly and accurately programmed and used for practical applications of photonics and optoelectronics. Different approaches to using RET networks exist with vast potential applications. As an entropy source that can directly generate samples from virtually arbitrary distributions, RET networks can benefit applications that rely on generating random samples such as 1) fluorescent taggants and 2) stochastic computing.
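The CTMC-to-phase-type mapping described above can be illustrated with a minimal simulation: the time until an absorbing "emission" state is reached is a phase-type distributed sample. The two transient states and their rates below are hypothetical stand-ins, not values from an actual chromophore network:

```python
import random

# Minimal sketch of how a CTMC yields phase-type distributed samples.
# RATES[i] maps state i to {next_state: transition_rate}; 'emit' is absorbing.
RATES = {
    "A": {"B": 2.0, "emit": 0.5},  # transfer A -> B, or emit from A
    "B": {"A": 1.0, "emit": 3.0},  # transfer back, or emit from B
}

def sample_absorption_time(start="A"):
    """One phase-type sample: total time until the chain reaches 'emit'."""
    t, state = 0.0, start
    while state != "emit":
        out = RATES[state]
        total = sum(out.values())
        t += random.expovariate(total)  # exponential holding time in state
        r, acc = random.uniform(0.0, total), 0.0
        for nxt, rate in out.items():   # jump chosen proportional to rate
            acc += rate
            if r <= acc:
                state = nxt
                break
    return t

random.seed(1)
samples = [sample_absorption_time() for _ in range(10_000)]
mean_time = sum(samples) / len(samples)  # analytic mean for these rates: 0.75
```

In the physical system, each sample corresponds to the delay between excitation and a detected fluorescence photon; here the delays are drawn by simulation instead.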

By using RET networks between chromophores to implement fluorescent taggants with temporally coded signatures, the taggant design is not constrained by resolvable dyes and has a significantly larger coding capacity than spectrally or lifetime coded fluorescent taggants. Meanwhile, the taggant detection process becomes highly efficient, and the Maximum Likelihood Estimation (MLE) based taggant identification guarantees high accuracy even with only a few hundred detected photons.

Meanwhile, RET-based sampling units (RSU) can be constructed to accelerate probabilistic algorithms for wide applications in machine learning and data analytics. Because probabilistic algorithms often rely on iteratively sampling from parameterized distributions, they can be inefficient in practice on the deterministic hardware traditional computers use, especially for high-dimensional and complex problems. As an efficient universal sampling unit, the proposed RSU can be integrated into a processor / GPU as specialized functional units or organized as a discrete accelerator to bring substantial speedups and power savings.

Relevance:

100.00%

Publisher:

Abstract:

Human societies are reliant on the functioning of the hydrologic cycle. The atmospheric branch of this cycle, often referred to as moisture recycling in the context of land-to-land exchange, refers to water evaporating, traveling through the atmosphere, and falling out as precipitation. Similar to the surface water cycle that uses the watershed as the unit of analysis, it is also possible to consider a ‘watershed of the sky’ for the atmospheric water cycle. Thus, I explore the precipitationshed, defined as the upwind surface of the Earth that provides evaporation that later falls as precipitation in a specific place. The primary contributions of this dissertation are to (a) introduce the precipitationshed concept, (b) provide a quantitative basis for the study of the precipitationshed, and (c) demonstrate its use in the fields of hydrometeorology, land-use change, social-ecological systems, ecosystem services, and environmental governance. In Paper I, the concept of the precipitationshed is introduced and explored for the first time. The quantification of precipitationshed variability is described in Paper II, and the key finding is that the precipitationsheds for multiple regions are persistent in time and space. Moisture recycling is further described as an ecosystem service in Paper III, to integrate the concept into the existing language of environmental sustainability and management. That is, I identify regions where vegetation more strongly regulates the provision of atmospheric water, as well as the regions that more strongly benefit from this regulation. In Paper IV, the precipitationshed is further explored through the lens of urban reliance on moisture recycling. Using a novel method, I quantify the vulnerability of urban areas to social-ecological changes within their precipitationsheds.
In Paper V, I argue that successful moisture recycling governance will require flexible, transboundary institutions that are capable of operating within complex social-ecological systems. I conclude that, in the future, the precipitationshed can be a key tool in addressing the complexity of social-ecological systems. 

Relevance:

100.00%

Publisher:

Abstract:

Urban areas such as megacities (those with populations greater than 10 million) are hotspots of global water use and thus face intense water management challenges. Urban areas are influenced by local interactions between human and natural systems and interact with distant systems through flows of water, food, energy, people, information, and capital. However, analyses of water sustainability and the management of water flows in urban areas are often fragmented. There is a strong need to apply integrated frameworks to systematically analyze urban water dynamics and factors that influence these dynamics. We apply the framework of telecoupling (socioeconomic and environmental interactions over distances) to analyze urban water issues, using Beijing as a demonstration megacity. Beijing exemplifies the global water sustainability challenge for urban settings. Like many other cities, Beijing has experienced drastic reductions in quantity and quality of both surface water and groundwater over the past several decades; it relies on the import of real and virtual water from sending systems to meet its demand for clean water, and releases polluted water to other systems (spillover systems). The integrative framework we present demonstrates the importance of considering socioeconomic and environmental interactions across telecoupled human and natural systems, which include not only Beijing (the water-receiving system) but also water-sending systems and spillover systems. This framework helps integrate important components of local and distant human–nature interactions and incorporates a wide range of local couplings and telecouplings that affect water dynamics, which in turn generate significant socioeconomic and environmental consequences, including feedback effects. The application of the framework to Beijing reveals many research gaps and management needs. 
We also provide a foundation to apply the telecoupling framework to better understand and manage water sustainability in other cities around the world.

Relevance:

100.00%

Publisher:

Abstract:

The overwhelming amount and unprecedented speed of publication in the biomedical domain make it difficult for life science researchers to acquire and maintain a broad view of the field and to gather all information relevant to their research. In response to this problem, the BioNLP (Biomedical Natural Language Processing) community of researchers has emerged and strives to assist life science researchers by developing modern natural language processing (NLP), information extraction (IE) and information retrieval (IR) methods that can be applied at large scale to scan the whole publicly available biomedical literature and to extract and aggregate the information found within, while automatically normalizing the variability of natural language statements. Among the different tasks, biomedical event extraction has recently received much attention within the BioNLP community. Biomedical event extraction is the identification of biological processes and interactions described in biomedical literature, and their representation as a set of recursive event structures. The 2009-2013 series of BioNLP Shared Tasks on Event Extraction has given rise to a number of event extraction systems, several of which have been applied at large scale (the full set of PubMed abstracts and PubMed Central Open Access full-text articles), leading to the creation of massive biomedical event databases, each containing millions of events. Since top-ranking event extraction systems are based on machine-learning approaches and are trained on the narrow-domain, carefully selected Shared Task training data, their performance drops when faced with the topically highly varied PubMed and PubMed Central documents. Specifically, false-positive predictions by these systems lead to the generation of incorrect biomolecular events, which are spotted by end-users.
This thesis proposes a novel post-processing approach, utilizing a combination of supervised and unsupervised learning techniques, that can automatically identify and filter out a considerable proportion of incorrect events from large-scale event databases, thus increasing the general credibility of those databases. The second part of this thesis is dedicated to a system we developed for hypothesis generation from large-scale event databases, which is able to discover novel biomolecular interactions among genes and gene products. We cast the hypothesis generation problem as supervised network topology prediction, i.e. predicting new edges in the network, as well as the types and directions of these edges, utilizing a set of features that can be extracted from large biomedical event networks. Routine machine learning evaluation results, as well as manual evaluation results, suggest that the problem is indeed learnable. This work won the Best Paper Award at the 5th International Symposium on Languages in Biology and Medicine (LBM 2013).

Relevance:

100.00%

Publisher:

Abstract:

The main drivers for the development and evolution of Cyber Physical Systems (CPS) are the reduction of development costs and time, along with the enhancement of the designed products. The aim of this survey paper is to provide an overview of the different types of systems and the associated transition process from mechatronics to CPS and cloud-based (IoT) systems. It further considers the requirement that methodologies for CPS design should be part of a multi-disciplinary development process within which designers focus not only on the separate physical and computational components, but also on their integration and interaction. Challenges related to CPS design are therefore considered in the paper from the perspectives of the physical processes, computation and integration, respectively. Illustrative case studies are selected from different system levels, starting with a description of the overarching concept of Cyber Physical Production Systems (CPPSs). The analysis and evaluation of the specific properties of a sub-system using a condition monitoring system, important for maintenance purposes, is then given for a wind turbine.

Relevance:

100.00%

Publisher:

Abstract:

With the prevalence of smartphones, new ways of engaging citizens and stakeholders in urban planning and governance are emerging. The technologies in smartphones allow citizens to act as sensors of their environment, producing and sharing rich spatial data useful for new types of collaborative governance set-ups. Data derived from Volunteered Geographic Information (VGI) can support accessible, transparent, democratic, inclusive, and locally-based governance situations of interest to planners, citizens, politicians, and scientists. However, there are still uncertainties about how to actually conduct this in practice. This study explores how social media VGI can be used to document spatial tendencies regarding citizens' uses and perceptions of urban nature with relevance for urban green space governance. Via the hashtag #sharingcph, created by the City of Copenhagen in 2014, VGI data consisting of geo-referenced images were collected from Instagram, categorised according to their content and analysed according to their spatial distribution patterns. The results show specific spatial distributions of the images and main hotspots. There is much potential in using VGI for generating, sharing, visualising and communicating knowledge about citizens' spatial uses and preferences, but as a tool to support scientific and democratic interaction, VGI data is challenged by practical, technical and ethical concerns. More research is needed in order to better understand the usefulness and application of this rich data source to governance.

Relevance:

100.00%

Publisher:

Abstract:

We assessed the genetic structure of populations of the widely distributed sea cucumber Holothuria (Holothuria) mammata Grube, 1840, and investigated the effects of marine barriers to gene flow and historical processes. Several potential genetic breaks were considered, which would separate the Atlantic and Mediterranean basins, the isolated Macaronesian Islands from the other locations analysed, and the Western Mediterranean and Aegean Sea (Eastern Mediterranean). We analysed mitochondrial 16S and COI gene sequences from 177 individuals from four Atlantic locations and four Mediterranean locations. Haplotype diversity was high (H = 0.9307 for 16S and 0.9203 for COI), and the haplotypes were closely related (p = 0.0058 for 16S and 0.0071 for COI). The lowest genetic diversities were found in the Aegean Sea population. Our results showed that the COI gene was more variable and more useful for the detection of population structure than the 16S gene. The distribution of mtDNA haplotypes, the pairwise FST values and the results of exact tests and AMOVA revealed: (i) a significant genetic break between the population in the Aegean Sea and those in the other locations, as supported by both mitochondrial genes, and (ii) weak differentiation of the Canary and Azores Islands from the other populations; however, the populations from the Macaronesian Islands, the Algarve and the Western Mediterranean could be considered a panmictic metapopulation. Isolation by distance was not identified in H. (H.) mammata. Historical events behind the observed findings, together with the current oceanographic patterns, were proposed and discussed as the main factors that determine the population structure and genetic signature of H. (H.) mammata.
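The haplotype diversity statistic H reported above is commonly computed as Nei's gene diversity, H = n/(n-1) · (1 - Σ pᵢ²), where pᵢ is the frequency of haplotype i among n sequences. A minimal sketch, using hypothetical haplotype labels rather than the study's 16S/COI data:

```python
from collections import Counter

# Nei's haplotype (gene) diversity: H = n/(n-1) * (1 - sum(p_i**2)).
# The haplotype labels below are hypothetical examples.
def haplotype_diversity(haplotypes):
    n = len(haplotypes)
    freqs = (count / n for count in Counter(haplotypes).values())
    return n / (n - 1) * (1.0 - sum(p * p for p in freqs))

sample = ["h1", "h1", "h2", "h3", "h3", "h4", "h5", "h6"]  # n = 8 sequences
H = haplotype_diversity(sample)  # = 8/7 * (1 - 12/64) ≈ 0.929
```

Values near 1, as reported for both genes, indicate that almost every sampled individual carries a distinct haplotype.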