862 results for COMBINATORIAL TECHNOLOGIES
Abstract:
In the principles-and-parameters model of language, the principle known as "free indexation'' plays an important part in determining the referential properties of elements such as anaphors and pronominals. This paper addresses two issues. (1) We investigate the combinatorics of free indexation. In particular, we show that free indexation must produce an exponential number of referentially distinct structures. (2) We introduce a compositional free indexation algorithm. We prove that the algorithm is "optimal.'' More precisely, by relating the compositional structure of the formulation to the combinatorial analysis, we show that the algorithm enumerates precisely all possible indexings, without duplicates.
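As a rough illustration of the enumeration property claimed above, the following sketch (not the paper's compositional algorithm) generates every distinct indexing of n elements exactly once by encoding coindexation patterns as restricted-growth strings, i.e. treating an indexing as a partition of the elements into coindexed groups; the function name and representation are illustrative assumptions.

```python
def indexings(n):
    """Enumerate all referentially distinct index assignments for n elements.

    Indexings are generated as restricted-growth strings, so each distinct
    pattern of coindexation appears exactly once (no duplicates).  The count
    is the n-th Bell number, which grows at least exponentially in n.
    """
    def extend(prefix, max_used):
        if len(prefix) == n:
            yield tuple(prefix)
            return
        # the next element may reuse any index seen so far, or take a fresh one
        for idx in range(max_used + 2):
            yield from extend(prefix + [idx], max(max_used, idx))
    yield from extend([], -1)

# e.g. 3 elements yield 5 distinct indexings; 10 elements already yield 115975
print(list(indexings(3)))
```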
Abstract:
Norris, G. & Wilson, P., 'Crime Prevention and New Technologies: The Special Case of CCTV', In: Issues in Australian Crime and Criminal Justice, Lexis-Nexis, pp.409-418, 2005. RAE2008
Abstract:
Book reviews and reports
Abstract:
A review of a few epidemiological studies will improve the understanding of the potential health effects of waste management and will provide important information regarding future work. Several studies showed significant relationships between several methods of waste management and potential impacts on human health. In other studies the associations were found to be inconsistent or equivocal, and more specific epidemiological studies must be performed to assess the consequences for human health and to determine their direct toxicological effects, thus ensuring that waste management poses minimal risk to health. Epidemiological studies should be analysed with an open mind, taking into consideration factors such as the social status and migration of the populations studied.
Abstract:
Identification of common sub-sequences for a group of functionally related DNA sequences can shed light on the role of such elements in cell-specific gene expression. In the megakaryocytic lineage, no single transcription factor has been described as lineage-specific, raising the possibility that a cluster of gene promoter sequences presents a unique signature. Here, the megakaryocytic gene promoter group, which consists of both human and mouse 5' non-coding regions, served as a case study. A methodology for group-combinatorial search has been implemented as a customized software platform. It extracts the longest common sequences for a group of related DNA sequences and allows for single gaps of varying length, as well as double- and multiple-gap sequences. The results point to common DNA sequences in a group of genes that is selectively expressed in megakaryocytes and does not appear in a large group of control, random and specific sequences. This suggests a role for a combination of these sequences in cell-specific gene expression in the megakaryocytic lineage. The data also point to an intrinsic cross-species difference in the organization of 5' non-coding sequences within the mammalian genomes. This methodology may be used for the identification of regulatory sequences in other lineages.
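A much-simplified sketch of the group-search idea (ungapped case only; the single-, double- and multiple-gap handling described above is omitted), with illustrative sequence data and function names:

```python
def longest_common_substrings(sequences, min_len=6):
    """Return the longest substrings shared by every sequence in the group.

    Brute-force sketch: enumerate substrings of the shortest sequence,
    longest first, and keep those that occur in all other sequences.
    """
    shortest = min(sequences, key=len)
    others = [s for s in sequences if s is not shortest]
    for length in range(len(shortest), min_len - 1, -1):
        hits = {shortest[i:i + length]
                for i in range(len(shortest) - length + 1)
                if all(shortest[i:i + length] in s for s in others)}
        if hits:
            return hits          # longest substrings common to the whole group
    return set()

# Hypothetical promoter fragments for demonstration only
promoters = ["ACGGATTACAGGT", "TTGGATTACAGCA", "GGATTACAGTTTT"]
print(longest_common_substrings(promoters, min_len=4))  # {'GGATTACAG'}
```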
Abstract:
In this paper, we study the efficacy of genetic algorithms in the context of combinatorial optimization. In particular, we isolate the effects of cross-over, treated as the central component of genetic search. We show that for problems of nontrivial size and difficulty, the contribution of cross-over search is marginal, both synergistically when run in conjunction with mutation and selection, and when run with selection alone, the reference point being the search procedure consisting of just mutation and selection. The latter can be viewed as another manifestation of the Metropolis process. Considering the high computational cost of maintaining a population to facilitate cross-over search, its marginal benefit renders genetic search inferior to its singleton-population counterpart, the Metropolis process, and by extension, simulated annealing. This is further compounded by the fact that many problems arising in practice may inherently require a large number of state transitions for a near-optimal solution to be found, making genetic search infeasible given the high cost of computing a single iteration in the enlarged state-space.
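A toy sketch of the two search procedures being compared, on an illustrative bit-string objective; the fitness function, parameters and selection scheme are assumptions for demonstration and are not those used in the paper:

```python
import random

def onemax(x):                      # toy objective: number of 1-bits
    return sum(x)

def mutate(x, rate=0.02):
    return [b ^ (random.random() < rate) for b in x]

def metropolis_search(n_bits=100, steps=5000):
    """Singleton-population search: mutation plus greedy selection."""
    x = [random.randint(0, 1) for _ in range(n_bits)]
    for _ in range(steps):
        y = mutate(x)
        if onemax(y) >= onemax(x):   # accept if no worse
            x = y
    return onemax(x)

def genetic_search(n_bits=100, pop=20, steps=250):
    """Population search adding one-point cross-over to mutation and selection."""
    P = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop)]
    for _ in range(steps):
        a, b = random.sample(P, 2)
        cut = random.randrange(1, n_bits)
        child = mutate(a[:cut] + b[cut:])          # cross-over then mutation
        worst = min(range(pop), key=lambda i: onemax(P[i]))
        if onemax(child) >= onemax(P[worst]):      # replace the worst member
            P[worst] = child
    return max(onemax(x) for x in P)

print(metropolis_search(), genetic_search())
```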
Abstract:
Understanding and modeling the factors that underlie the growth and evolution of network topologies are basic questions that impact capacity planning, forecasting, and protocol research. Early topology generation work focused on generating network-wide connectivity maps, either at the AS-level or the router-level, typically with an eye towards reproducing abstract properties of observed topologies. But recently, advocates of an alternative "first-principles" approach question the feasibility of realizing representative topologies with simple generative models that do not explicitly incorporate real-world constraints, such as the relative costs of router configurations, into the model. Our work synthesizes these two lines of work by designing a topology generation mechanism that incorporates first-principles constraints. Our goal is more modest than that of constructing an Internet-wide topology: we aim to generate representative topologies for single ISPs. However, our methods also go well beyond previous work, as we annotate these topologies with representative capacity and latency information. Taking only demand for network services over a given region as input, we propose a natural cost model for building and interconnecting PoPs and formulate the resulting optimization problem faced by an ISP. We devise hill-climbing heuristics for this problem and demonstrate that the solutions we obtain are quantitatively similar to those in measured router-level ISP topologies, with respect to both topological properties and fault-tolerance.
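A schematic sketch of a hill-climbing heuristic of the kind described; the toy cost model (a fixed cost per PoP plus demand-to-nearest-PoP distance) and the move set are illustrative assumptions, not the paper's formulation:

```python
import random, math

def cost(pops, demand_points, link_cost=1.0, build_cost=5.0):
    """Toy cost model: fixed cost per PoP plus distance from each demand point
    to its nearest PoP (a stand-in for provisioning capacity and latency)."""
    assign = sum(min(math.dist(d, p) for p in pops) for d in demand_points)
    return build_cost * len(pops) + link_cost * assign

def hill_climb(demand_points, iters=2000):
    """Greedy local search over PoP placements: move, add or drop one PoP."""
    pops = [random.choice(demand_points)]
    best = cost(pops, demand_points)
    for _ in range(iters):
        cand = [list(p) for p in pops]
        move = random.random()
        if move < 0.4 and len(cand) > 1:
            cand.pop(random.randrange(len(cand)))            # drop a PoP
        elif move < 0.8:
            i = random.randrange(len(cand))                  # perturb a PoP
            cand[i] = [cand[i][0] + random.uniform(-1, 1),
                       cand[i][1] + random.uniform(-1, 1)]
        else:
            cand.append(list(random.choice(demand_points)))  # add a PoP
        c = cost(cand, demand_points)
        if c < best:                                         # keep improving moves only
            pops, best = cand, c
    return pops, best

demand = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(50)]
print(hill_climb(demand)[1])
```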
Abstract:
The combinatorial Dirichlet problem is formulated, and an algorithm for solving it is presented. This provides an effective method for interpolating missing data on weighted graphs of arbitrary connectivity. Image processing examples are shown, and the relation to anisotropic diffusion is discussed.
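A minimal sketch of how such an interpolation can be computed: build the combinatorial Laplacian of the weighted graph and solve the resulting linear system for the unknown vertices given the known (boundary) values; the example graph and weights are illustrative:

```python
import numpy as np

def dirichlet_interpolate(W, boundary):
    """Interpolate missing vertex values by solving the combinatorial
    Dirichlet problem: minimize x^T L x subject to fixed boundary values.

    W        : symmetric (n x n) edge-weight matrix of the graph
    boundary : dict {vertex index: known value}
    """
    n = W.shape[0]
    L = np.diag(W.sum(axis=1)) - W                 # combinatorial Laplacian
    known = sorted(boundary)
    unknown = [i for i in range(n) if i not in boundary]
    xb = np.array([boundary[i] for i in known])
    # Harmonic condition on the unknowns: L_uu x_u = -L_ub x_b
    xu = np.linalg.solve(L[np.ix_(unknown, unknown)],
                         -L[np.ix_(unknown, known)] @ xb)
    x = np.empty(n)
    x[known], x[unknown] = xb, xu
    return x

# Path graph 0-1-2-3 with unit weights; values fixed at the endpoints
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(dirichlet_interpolate(W, {0: 0.0, 3: 3.0}))   # ~[0., 1., 2., 3.]
```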
Abstract:
Advanced sensory systems address a number of major obstacles to the provision of cost-effective and proactive rehabilitation. Many of these systems employ technologies such as high-speed video or motion capture to generate quantitative measurements. However, these solutions are accompanied by some major limitations, including extensive set-up and calibration, restriction to indoor use, high cost and time-consuming data analysis. Additionally, many do not quantify improvement in a rigorous manner, for example gait analysis for 5 minutes as opposed to 24-hour ambulatory monitoring. This work addresses these limitations using low-cost, wearable wireless inertial measurement as a mobile and minimal-infrastructure alternative. In cooperation with healthcare professionals, the goal is to design and implement a reconfigurable and intelligent movement capture system. A key component of this work is an extensive benchmark comparison with the 'gold standard' VICON motion capture system.
Abstract:
A comparison study was carried out between a wireless sensor node with a flip-chip-mounted bare die and its reference board with a BGA-packaged transceiver chip. The main focus is the return loss (S-parameter S11) at the antenna connector, which depends strongly on the impedance mismatch. Modeling, including the different interconnect technologies, substrate properties and passive components, was performed to simulate the system in Ansoft Designer software. Statistical methods, such as standard deviation and regression, were applied to the RF performance analysis to assess the impact of the different parameters on the return loss. An extreme value search, building on this analysis, can provide the parameter values that minimize the return loss. Measurements fit the analysis and simulation well and showed a large improvement of the return loss, from -5 dB to -25 dB, for the target wireless sensor node.
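A sketch of the statistical workflow described, assuming hypothetical design parameters and data: fit a simple response-surface regression of S11 against the parameters, then search the fitted surface for the parameter values giving the minimum return loss:

```python
import numpy as np

# Hypothetical simulated design points: (bond length [mm], substrate eps_r) -> S11 [dB]
X = np.array([[0.5, 3.0], [0.5, 4.4], [1.0, 3.0], [1.0, 4.4], [1.5, 3.0], [1.5, 4.4]])
s11 = np.array([-18.0, -22.0, -14.0, -20.0, -9.0, -15.0])

# Quadratic response-surface regression fitted by least squares
def features(p):
    x, y = p
    return [1.0, x, y, x * x, y * y, x * y]

A = np.array([features(p) for p in X])
coef, *_ = np.linalg.lstsq(A, s11, rcond=None)

# Extreme-value search: evaluate the fitted surface on a grid, take the minimum
grid = [(x, y) for x in np.linspace(0.5, 1.5, 51) for y in np.linspace(3.0, 4.4, 51)]
pred = [np.dot(coef, features(p)) for p in grid]
best = grid[int(np.argmin(pred))]
print("predicted minimum return loss %.1f dB at" % min(pred), best)
```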
Abstract:
An aim of proactive risk management strategies is the timely identification of safety-related risks. One way to achieve this is by deploying early warning systems. Early warning systems aim to provide useful information on the presence of potential threats to the system, the level of vulnerability of a system, or both, in a timely manner. This information can then be used to take proactive safety measures. The United Nations has recommended that any early warning system needs to have four essential elements: the risk knowledge element, a monitoring and warning service, dissemination and communication, and a response capability. This research deals with the risk knowledge element of an early warning system. The risk knowledge element of an early warning system contains models of possible accident scenarios. These accident scenarios are created using hazard analysis techniques, which are categorised as traditional and contemporary. The assumption in traditional hazard analysis techniques is that accidents occur due to a sequence of events, whereas the assumption of contemporary hazard analysis techniques is that safety is an emergent property of complex systems. The problem is that no software editor is available that analysts can use to create models of accident scenarios based on contemporary hazard analysis techniques and, at the same time, generate computer code that represents the models. This research aims to enhance the process of generating computer code based on graphical models that associate early warning signs and causal factors with a hazard, based on contemporary hazard analysis techniques. For this purpose, the thesis investigates the use of Domain Specific Modeling (DSM) technologies. The contribution of this thesis is the design and development of a set of three graphical Domain Specific Modeling Languages (DSMLs) that, when combined, provide all of the necessary constructs to enable safety experts and practitioners to conduct hazard and early warning analysis based on a contemporary hazard analysis approach. The languages represent the elements and relations necessary to define accident scenarios and their associated early warning signs. The three DSMLs were incorporated into a prototype software editor that enables safety scientists and practitioners to create and edit hazard and early warning analysis models in a usable manner and, as a result, to generate executable code automatically. This research demonstrates that DSM technologies can be used to develop a set of three DSMLs that allow users to conduct hazard and early warning analysis in a more usable manner. Furthermore, the three DSMLs and their dedicated editor, which are presented in this thesis, may provide a significant enhancement to the process of creating the risk knowledge element of computer-based early warning systems.
Abstract:
Drug delivery systems influence the various processes of drug release, absorption, distribution and elimination. Conventional delivery methods administer drugs through the mouth, the skin, transmucosal areas, inhalation or injection. However, one of the current challenges is the lack of effective and targeted oral drug administration. Development of sophisticated strategies, such as micro- and nanotechnology, that can integrate the design and synthesis of drug delivery systems in a one-step, scalable process is fundamental to overcoming the limitations of conventional processing techniques. Thus, the objective of this thesis is to evaluate novel microencapsulation technologies for the production of size-specific and target-specific drug-loaded particles. The first part of this thesis describes the utility of PDMS and silicon microfluidic flow focusing devices (MFFDs) to produce PLGA-based microparticles. The formation of uniform droplets was dependent on the surface of the PDMS remaining hydrophilic. However, the durability of PDMS was limited to no more than 1 hour before wetting of the microchannel walls with dichloromethane and subsequent swelling occurred. Critically, silicon MFFDs showed very good solvent compatibility and were sufficiently robust to withstand elevated fluid flow rates. Silicon MFFDs allowed experiments to run over several days with continuous use and re-use of the device, with a narrower microparticle size distribution relative to conventional production techniques. The second part of this thesis demonstrates an alternative microencapsulation technology, SmPill® minispheres, to target CsA delivery to the colon. Characterisation of CsA release in vitro and in vivo was performed. By modulating the ethylcellulose:pectin coating thickness, release of CsA in vivo was more effectively controlled than with current commercial CsA formulations, and a linear in vitro-in vivo relationship was demonstrated. Coated minispheres were shown to limit CsA release in the upper small intestine and enhance localised CsA delivery to the colon.
Abstract:
In order to widely use Ge and III-V materials instead of Si in advanced CMOS technology, the processing and integration of these materials has to be well established so that their high-mobility benefit is not swamped by imperfect manufacturing procedures. In this dissertation, a number of key bottlenecks in the realization of Ge devices are investigated. We address the challenge of forming low-resistivity contacts on n-type Ge, comparing the conventional rapid thermal annealing (RTA) and the advanced laser thermal annealing (LTA) techniques. LTA appears to be a feasible approach for realizing low-resistivity contacts, with an extremely sharp germanide-substrate interface and a contact resistivity on the order of 10^-7 Ω·cm^2. Furthermore, the influence of RTA and LTA on dopant activation and leakage current suppression in n+/p Ge junctions was compared. While providing a very high active carrier concentration (> 10^20 cm^-3), LTA resulted in a higher leakage current than RTA, which provided a lower carrier concentration (~10^19 cm^-3). This is an indication of a trade-off between a high activation level and junction leakage current. A high ION/IOFF ratio of ~10^7 was obtained, which to the best of our knowledge is the best value reported for n-type Ge so far. Simulations were carried out to investigate how target sputtering, dose retention and damage formation are generated in thin-body semiconductors by energetic ion impacts, and how they depend on the physical properties of the target material. Solid phase epitaxy studies in wide and thin Ge fins confirmed the formation of twin-boundary defects and random nucleation growth, as in Si, but here a 600 °C annealing temperature was found to be effective in reducing these defects. Finally, a non-destructive doping technique was successfully implemented to dope Ge nanowires, where the nanowire resistivity was reduced by 5 orders of magnitude using a PH3-based in-diffusion process.
Abstract:
This qualitative research expands understanding of how information about a range of Novel Food Technologies (NFTs) is used and assimilated, and the implications of this on the evolution of attitudes and acceptance. This work enhances theoretical and applied understanding of citizens’ evaluative processes around these technologies. The approach applied involved observations of interactive exchanges between citizens and information providers (i.e. food scientists), during which they discussed a specific technology. This flexible, yet structured, approach revealed how individuals construct meaning around information about specific NFTs. A rich dataset of 42 ‘deliberate discourse’ and 42 post-discourse transcripts was collected. Data analysis encompassed three stages: an initial descriptive account of the complete dataset based on the top-down bottom-up (TDBU) model of attitude formation, followed by inductive and deductive thematic analysis across the selected technology groups. The hybrid thematic analysis undertaken identified a Conceptual Model, which represents a holistic perspective on the influences and associated features directing ‘sense-making’ and ultimate evaluations around the technology clusters. How individuals make sense of these technologies is shaped by: their beliefs, values and personal characteristics; their perceptions of power and control over the application of the technology; and the assumed relevance of the technology and its applications within different contexts. These influences form the frame for the creation of sense-making around the technologies. Internal negotiations between these influences are evident, and evaluations are based on the relative importance of each influence to the individual, which tends to contribute to attitude ambivalence and instability. The findings indicate that the processes of forming and changing attitudes towards these technologies are: complex; dependent on characteristics of the individual, technology, application and product; and impacted by the nature and forms of information provided. Challenges are faced in engaging with the public about these technologies, as levels of knowledge, understanding and interest vary.
Abstract:
The combinatorial model of nuclear level densities has now reached a level of accuracy comparable to that of the best global analytical expressions, without suffering from the limits imposed by the statistical hypothesis on which the latter expressions rely. In particular, it naturally provides non-Gaussian spin distributions as well as non-equipartition of parities, which are known to have an impact on cross section predictions at low energies [1, 2, 3]. Our previous global models developed in Refs. [1, 2] suffered from deficiencies, in particular in the way the collective effects - both vibrational and rotational - were treated. We have recently improved this treatment by using simultaneously the single-particle levels and collective properties predicted by a newly derived Gogny interaction [4], therefore enabling a microscopic description of energy-dependent shell, pairing and deformation effects. In addition, for deformed nuclei, the transition to sphericity is coherently taken into account on the basis of a temperature-dependent Hartree-Fock calculation, which provides at each temperature the structure properties needed to build the level densities. This new method is described and shown to give promising results with respect to available experimental data.
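For contrast with the non-Gaussian distributions mentioned above, a brief reminder (standard spin-cutoff form from the statistical models, not a formula taken from this work) of the Gaussian spin dependence that analytical level-density expressions typically assume, with spin-cutoff parameter σ:

```latex
% Gaussian spin distribution assumed in statistical (spin-cutoff) models;
% the combinatorial approach does not impose this form.
\[
  \rho(E, J) \simeq \rho(E)\,\frac{2J+1}{2\sigma^2}\,
  \exp\!\left[-\frac{\left(J+\tfrac{1}{2}\right)^2}{2\sigma^2}\right]
\]
```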