178 results for Leveraged buyouts


Relevance: 10.00%

Publisher:

Abstract:

Tuberculosis (TB) is a life-threatening disease caused by infection with Mycobacterium tuberculosis (Mtb). Because most TB strains have become resistant to various existing drugs, the development of effective novel drug candidates to combat this disease is an urgent need. In spite of intensive research worldwide, the success rate of discovering a new anti-TB drug is very poor, so novel drug discovery methods have to be tried. We have used a rule-based computational method that utilizes a vertex index, named the 'distance exponent index (D-x)' (taken with x = -4 here), for predicting the anti-TB activity of a series of acid alkyl ester derivatives. The method is meant to identify activity-related substructures from a series of compounds and to predict the activity of a compound on that basis. The high degree of successful prediction in the present study suggests that the method may be useful in discovering effective anti-TB compounds. It is also apparent that substructural approaches may be leveraged for a wide range of purposes in computer-aided drug design.
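
The abstract does not spell out how the distance exponent index is computed; below is a minimal sketch, assuming the common formulation in which each vertex of the hydrogen-suppressed molecular graph is assigned the sum of its topological distances to all other vertices raised to the exponent x (here x = -4). The n-butane adjacency list is only a toy illustration, not a compound from the study.

```python
# Illustrative sketch only: assumes the distance exponent index assigns each
# vertex i the value sum_j d(i, j)**x over all other vertices j, where d(i, j)
# is the shortest-path (topological) distance and x = -4.
from collections import deque

def shortest_path_distances(adjacency, source):
    """BFS distances from `source` in an unweighted molecular graph."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adjacency[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def distance_exponent_index(adjacency, x=-4):
    """Per-vertex distance exponent index under the assumed definition."""
    index = {}
    for i in adjacency:
        d = shortest_path_distances(adjacency, i)
        index[i] = sum(d[j] ** x for j in d if j != i)
    return index

# Toy example: the n-butane carbon skeleton C1-C2-C3-C4.
butane = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(distance_exponent_index(butane))
```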

Relevance: 10.00%

Publisher:

Abstract:

Starting in the 1980s, household-level water treatment and safe storage systems (HWTS) have been developed as simple, local, user-friendly, and low-cost options for improving drinking water quality at the point of use. However, despite conclusive evidence of the health and economic benefits of HWTS, and promotion efforts in over 50 countries in the past 20 years, uptake has been slow, reaching only 5-10 million regular users. This study attempts to understand the barriers and drivers affecting HWTS implementation. Although the existing literature on HWTS and innovation diffusion theory proposes ample critical factors and recommendations, a holistic and systemic approach to integrating these findings is lacking. System dynamics modelling is proposed as a promising tool for mapping the inter-relationships among these critical factors and understanding the structure of the HWTS dissemination process, which may lead to identifying high-impact, leveraged mitigation strategies to scale up HWTS adoption and sustained use.
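
To make the proposal concrete, here is a deliberately simplified stock-and-flow sketch of the kind of system dynamics model described: a single stock of regular HWTS users, fed by promotion- and word-of-mouth-driven adoption and drained by discontinuation. The structure and all parameter values are illustrative assumptions, not findings from the HWTS literature.

```python
# Minimal stock-and-flow sketch (Euler integration). All rates are assumed,
# purely for illustration of the modelling approach.
def simulate_hwts_adoption(years=20, dt=0.25,
                           population=1_000_000,
                           promotion_rate=0.01,        # adoption driven by campaigns
                           word_of_mouth=0.4,          # adoption driven by contact with users
                           discontinuation_rate=0.15): # users who stop regular use
    users = 0.0
    history = []
    for step in range(int(years / dt)):
        non_users = population - users
        adoption = (promotion_rate + word_of_mouth * users / population) * non_users
        dropout = discontinuation_rate * users
        users += (adoption - dropout) * dt
        history.append((step * dt, users))
    return history

for t, u in simulate_hwts_adoption()[::20]:
    print(f"year {t:5.1f}: {u:,.0f} regular users")
```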

Relevance: 10.00%

Publisher:

Abstract:

Arid and semiarid landscapes comprise nearly a third of the Earth's total land surface. These areas are coming under increasing land use pressures. Despite their low productivity these lands are not barren. Rather, they consist of fragile ecosystems vulnerable to anthropogenic disturbance.

The purpose of this thesis is threefold: (I) to develop and test a process model of wind-driven desertification, (II) to evaluate next-generation process-relevant remote monitoring strategies for use in arid and semiarid regions, and (III) to identify elements for effective management of the world's drylands.

In developing the process model of wind-driven desertification in arid and semiarid lands, field, remote sensing, and modeling observations from a degraded Mojave Desert shrubland are used. This model focuses on aeolian removal and transport of dust, sand, and litter as the primary mechanisms of degradation: killing plants by burial and abrasion, interrupting natural processes of nutrient accumulation, and allowing the loss of soil resources by abiotic transport. This model is tested in field sampling experiments at two sites and is extended by Fourier Transform and geostatistical analysis of high-resolution imagery from one site.

Next, the use of hyperspectral remote sensing data is evaluated as a substantive input to dryland remote monitoring strategies. In particular, the efficacy of spectral mixture analysis (SMA) in discriminating vegetation and soil types and determining vegetation cover is investigated. The results indicate that hyperspectral data may be less useful than often thought in determining vegetation parameters. Its usefulness in determining soil parameters, however, may be leveraged by developing simple multispectral classification tools that can be used to monitor desertification.
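
As context for the SMA evaluation just mentioned, the following sketch shows the core of linear spectral mixture analysis: each pixel spectrum is modelled as a linear combination of endmember spectra, and fractional covers are recovered by least squares. The band values and endmembers are made-up placeholders, not data from the thesis.

```python
# Linear spectral mixture analysis sketch with invented two-endmember spectra
# (e.g., vegetation and soil) over a few hypothetical bands.
import numpy as np

def unmix(pixel, endmembers):
    """Solve pixel ~ endmembers @ fractions by least squares, then clip and
    renormalise so the fractions are non-negative and sum to one."""
    fractions, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    fractions = np.clip(fractions, 0.0, None)
    return fractions / fractions.sum()

endmembers = np.array([
    [0.05, 0.30],   # band 1: vegetation, soil
    [0.08, 0.35],   # band 2
    [0.45, 0.40],   # band 3
    [0.50, 0.45],   # band 4
])
pixel = 0.3 * endmembers[:, 0] + 0.7 * endmembers[:, 1]
print(unmix(pixel, endmembers))   # approximately [0.3, 0.7]
```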

Finally, the elements required for effective monitoring and management of arid and semiarid lands are discussed. Several large-scale, multi-site field experiments are proposed to clarify the role of wind as a landscape and degradation process in drylands. The role of remote sensing in monitoring the world's drylands is discussed in terms of optimal remote sensing platform characteristics and the surface phenomena that may be monitored in order to identify areas at risk of desertification. A desertification indicator is proposed that unifies consideration of environmental and human variables.

Relevance: 10.00%

Publisher:

Abstract:

This final degree project examines venture capital entities. In brief, these are an alternative financing channel whose most important feature is that, beyond providing finance, they are also a source of know-how for promoting innovative business projects. As for the structure of the work, the concept and characteristics of venture capital investment are first examined from legal, economic and historical standpoints, the aim being to correct the mistaken view of these entities that prevails in society. Next, the typology of the entities and the applicable legislation are analysed, reviewing the 2014 reform and the rest of the regulatory framework. The regulatory side is then examined in more depth, namely the obligations these entities must fulfil and the limits they face when making investments. The challenges facing the sector are then examined, describing the current situation and the context in which corporate financing may find itself in the future. Finally, an assessment is offered in which, taking the historical and legal development into account, the shape the sector has regrettably taken is criticised, above all because it has opened the door to speculation, and because what could have been a source of financing for small, innovative firms has instead ended up promoting hostile takeovers and speculation-driven investments.

Relevance: 10.00%

Publisher:

Abstract:

The Instituto Superior Tecnológico do Rio de Janeiro (IST-Rio) served as the locus for this work, which recounts an institutional reality experienced by an educational community that incorporated the humanities into its pedagogical project within the perspective of a technological education. The work was grounded in Critical Theory, applied to the universe of public higher education and focused on working-class youth from the city of Rio de Janeiro. The journey, turbulent yet full of challenges, culminated in important achievements resulting from the effort of a team that believed in the many possible ways of realising a project committed to technological education in computer science, built on an integrated curriculum that covered the technical disciplines while still privileging the humanistic education so important for civic life. It was a project that, at the same time, kept a pedagogical eye on the construction of its practices and exercised the meaningful democratic management needed to consolidate a new style of coexistence. The results were significant and represented advances in the lives of teachers and students, as well as profound changes in the way knowledge is consolidated, which leveraged the development of an institutional sentiment capable of supporting the construction of a public policy proposal to expand technological education throughout the State of Rio de Janeiro, drawing on an innovative, creative and humane project.

Relevance: 10.00%

Publisher:

Abstract:

This thesis analyses one of the major technological projects of the Brazilian State, the PEB (Programa Espacial Brasileiro, the Brazilian Space Programme), in order to assess to what extent Brazil, as a developing country embedded in the process of economic globalization, can self-determine a relatively sovereign and sustainable national development project through technological capability in cutting-edge areas such as space technologies. In this context, the process of institutionalization of science in the country and the establishment of a modern S&T system in Brazil through an alliance between scientists and the military are discussed, culminating in the creation of CNPq in 1951. We offer a re-reading of Brazil's recent political history and of the national development projects formulated for the country by the most representative social groups of the period studied, recovering a debate that, spanning decades, reserved a privileged place for science in state planning. The period of military dictatorship receives particular attention, since this was the phase in which the Brazilian Space Programme received its largest investments, giving the military a prominent role among the collective social actors committed to the country's development project and bringing out the various ideological currents at work within the Armed Forces. The globalization process is analysed because of its internal and external connections with the science policies implemented or advocated in the country. That process, leveraged by the new international technological dynamic that began in the 1980s, has had profound impacts on, and produced changes in, the present constitution of the political sphere. This is the scenario in which, in our view, scientific and technological capability emerges as a strategic variable at every level of international relations. This problem must be understood as part of the world order that took shape in the last decades of the twentieth century, in which the intertwining of scientific-technological dynamics and the national sovereignty of states produces a distinctive synergy in the contemporary geopolitical reordering.

Relevance: 10.00%

Publisher:

Abstract:

University spin-out (USO) companies play an increasingly important role in generating value from radical, generic technologies, but this translation requires significant resources from other players to reach the market. Seven case studies illuminate how relationships with each type of partner can be leveraged to help the firm create value. We find that most firms in the sample are aware of the importance of corporate partners and actively seek to cultivate these relationships, but may not be taking full advantage of the resources available through nonparent academic institutions and other USOs with similar or complementary technologies. © 2013 The Authors. R&D Management © 2013 Blackwell Publishing Ltd.

Relevance: 10.00%

Publisher:

Abstract:

Optimization on manifolds is a rapidly developing branch of nonlinear optimization. Its focus is on problems where the smooth geometry of the search space can be leveraged to design efficient numerical algorithms. In particular, optimization on manifolds is well suited to dealing with rank and orthogonality constraints. Such structured constraints appear pervasively in machine learning applications, including low-rank matrix completion, sensor network localization, camera network registration, independent component analysis, metric learning, dimensionality reduction, and so on. The Manopt toolbox, available at www.manopt.org, is a user-friendly, documented piece of software dedicated to simplifying experimentation with state-of-the-art Riemannian optimization algorithms. By dealing internally with most of the differential geometry, the package aims in particular at lowering the barrier to entry. © 2014 Nicolas Boumal.
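
Manopt itself is a MATLAB toolbox, so rather than guess at its API, the sketch below uses plain numpy to illustrate the kind of Riemannian iteration it automates: maximising x'Ax over the unit sphere (the dominant eigenvector problem) by projecting the Euclidean gradient onto the tangent space and retracting back to the manifold.

```python
# Not Manopt code: a hand-rolled Riemannian gradient ascent on the unit sphere,
# shown only to illustrate projection-and-retraction style manifold optimization.
import numpy as np

def dominant_eigvec_riemannian(A, steps=500, lr=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    x /= np.linalg.norm(x)                  # start on the sphere
    for _ in range(steps):
        egrad = 2.0 * A @ x                 # Euclidean gradient of x' A x
        rgrad = egrad - (x @ egrad) * x     # project onto the tangent space at x
        x = x + lr * rgrad                  # ascent step in the tangent direction
        x /= np.linalg.norm(x)              # retraction back onto the sphere
    return x

A = np.array([[2.0, 0.5], [0.5, 1.0]])
x = dominant_eigvec_riemannian(A)
print(x, x @ A @ x)   # x @ A @ x approaches the largest eigenvalue of A
```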

Relevance: 10.00%

Publisher:

Abstract:

Wireless Intrusion Detection Systems (WIDS) monitor 802.11 wireless frames (Layer-2) in an attempt to detect misuse. What distinguishes a WIDS from a traditional Network IDS is the ability to utilize the broadcast nature of the medium to reconstruct the physical location of the offending party, as opposed to its possibly spoofed (MAC address) identity in cyber space. Traditional wireless network security systems are still heavily anchored in the digital plane of "cyber space" and hence cannot be used reliably or effectively to derive the physical identity of an intruder in order to prevent further malicious wireless broadcasts, for example by escorting an intruder off the premises based on physical evidence. In this paper, we argue that Embedded Sensor Networks could be used effectively to bridge the gap between the digital and physical security planes, and thus could be leveraged to provide reciprocal benefit to surveillance and security tasks on both planes. Toward that end, we present our recent experience integrating wireless networking security services into the SNBENCH (Sensor Network workBench). The SNBENCH provides an extensible framework that enables the rapid development and automated deployment of Sensor Network applications on a shared, embedded sensing and actuation infrastructure. The SNBENCH's extensible architecture allows an engineer to quickly integrate new sensing and response capabilities into the SNBENCH framework, while high-level languages and compilers allow novice SN programmers to compose SN service logic, unaware of the lower-level implementation details of the tools on which their services rely. In this paper we convey the simplicity of the service composition through concrete examples that illustrate the power and potential of wireless security services that span both the physical and digital planes.
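
The premise that the broadcast medium can be used to recover an offender's physical position is commonly realised with received-signal-strength localization. The sketch below shows one textbook variant (a log-distance path-loss model followed by linearised least-squares trilateration); it is not SNBENCH code, and the transmit power and path-loss exponent are assumed values.

```python
# Illustrative only: localise a transmitter from RSSI readings at known sensor
# positions. Path-loss parameters are assumptions, not measurements.
import numpy as np

def rssi_to_distance(rssi, tx_power=-40.0, path_loss_exp=2.5):
    """Invert the log-distance model: rssi = tx_power - 10*n*log10(d)."""
    return 10.0 ** ((tx_power - rssi) / (10.0 * path_loss_exp))

def trilaterate(sensors, distances):
    """Linearised least squares: subtract the first sensor's circle equation."""
    s0, d0 = sensors[0], distances[0]
    A = 2.0 * (sensors[1:] - s0)
    b = d0**2 - distances[1:]**2 + np.sum(sensors[1:]**2 - s0**2, axis=1)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_pos = np.array([3.0, 7.0])
rssi = -40.0 - 25.0 * np.log10(np.linalg.norm(sensors - true_pos, axis=1))
print(trilaterate(sensors, rssi_to_distance(rssi)))   # approximately [3, 7]
```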

Relevance: 10.00%

Publisher:

Abstract:

Commonly, research work in routing for delay tolerant networks (DTN) assumes that node encounters are predestined, in the sense that they are the result of unknown, exogenous processes that control the mobility of these nodes. In this paper, we argue that for many applications such an assumption is too restrictive: while the spatio-temporal coordinates of the start and end points of a node's journey are determined by exogenous processes, the specific path that a node may take in space-time, and hence the set of nodes it may encounter, could be controlled in such a way as to improve the performance of DTN routing. To that end, we consider a setting in which each mobile node is governed by a schedule consisting of a list of locations that the node must visit at particular times. Typically, such schedules exhibit some level of slack, which could be leveraged for DTN message delivery purposes. We define the Mobility Coordination Problem (MCP) for DTNs as follows: given a set of nodes, each with its own schedule, and a set of messages to be exchanged between these nodes, devise a set of node encounters that minimize message delivery delays while satisfying all node schedules. The MCP for DTNs is general enough to allow us to model and evaluate some of the existing DTN schemes, including data mules and message ferries. In this paper, we show that MCP for DTNs is NP-hard and propose two detour-based approaches to solve the problem. The first (DMD) is a centralized heuristic that leverages knowledge of the message workload to suggest specific detours that optimize message delivery. The second (DNE) is a distributed heuristic that is oblivious to the message workload, and which selects detours so as to maximize node encounters. We evaluate the performance of these detour-based approaches using extensive simulations based on synthetic workloads as well as real schedules obtained from taxi logs in a major metropolitan area. Our evaluation shows that our centralized, workload-aware DMD approach yields the best performance, in terms of message delay and delivery success ratio, and that our distributed, workload-oblivious DNE approach yields favorable performance when compared to approaches that require the use of data mules and message ferries.
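
The schedule slack that both heuristics exploit can be made concrete with a small feasibility test: given a node's (location, time) commitments and a travel speed, does a detour through a candidate meeting point fit between two consecutive commitments? This is not the paper's DMD or DNE heuristic, only an assumed building block such detour-based heuristics could rest on.

```python
# Hedged illustration of schedule slack: can a node squeeze in a detour to a
# meeting point between two scheduled visits? Coordinates and times are toy values.
import math

def travel_time(a, b, speed=1.0):
    return math.dist(a, b) / speed

def detour_fits(schedule, i, meeting_point, speed=1.0):
    """Can the node visit `meeting_point` between schedule[i] and schedule[i+1]?"""
    (loc_a, t_a), (loc_b, t_b) = schedule[i], schedule[i + 1]
    direct = travel_time(loc_a, loc_b, speed)
    with_detour = (travel_time(loc_a, meeting_point, speed)
                   + travel_time(meeting_point, loc_b, speed))
    slack = (t_b - t_a) - direct
    return with_detour - direct <= slack

schedule = [((0, 0), 0.0), ((10, 0), 20.0)]   # 10 units of travel, 20 time units of budget
print(detour_fits(schedule, 0, (5, 4)))        # True: the detour adds ~2.8 time units
print(detour_fits(schedule, 0, (5, 40)))       # False: the detour far exceeds the slack
```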

Relevance: 10.00%

Publisher:

Abstract:

The pervasiveness of personal computing platforms offers an unprecedented opportunity to deploy large-scale services that are distributed over wide physical spaces. Two major challenges face the deployment of such services: the often resource-limited nature of these platforms, and the necessity of preserving the autonomy of the owners of these devices. These challenges preclude using centralized control and preclude considering services that are subject to performance guarantees. To that end, this thesis advances a number of new distributed resource management techniques that are shown to be effective in such settings, focusing on two application domains: distributed Field Monitoring Applications (FMAs) and Message Delivery Applications (MDAs). In the context of FMAs, this thesis presents two techniques that are well suited to the fairly limited storage and power resources of autonomously mobile sensor nodes. The first technique relies on amorphous placement of sensory data through the use of novel storage management and sample diffusion techniques. The second approach relies on an information-theoretic framework to optimize local resource management decisions. Both approaches are proactive in that they aim to provide nodes with a view of the monitored field that reflects the characteristics of queries over that field, enabling them to handle more queries locally and thus reduce communication overheads. This thesis then recognizes node mobility as a resource to be leveraged, and in that respect proposes novel mobility coordination techniques for FMAs and MDAs. Assuming that node mobility is governed by a spatio-temporal schedule featuring some slack, this thesis presents novel algorithms of various computational complexities to orchestrate the use of this slack to improve the performance of supported applications. The findings in this thesis, which are supported by analysis and extensive simulations, highlight the importance of two general design principles for distributed systems. First, a priori knowledge (e.g., about the target phenomena of FMAs and/or the workload of either FMAs or MDAs) can be used effectively for local resource management. Second, judicious leverage and coordination of node mobility can lead to significant performance gains for distributed applications deployed over resource-impoverished infrastructures.

Relevance: 10.00%

Publisher:

Abstract:

Controlling the mobility pattern of mobile nodes (e.g., robots) to monitor a given field is a well-studied problem in sensor networks. In this setup, absolute control over the nodes' mobility is assumed. Apart from the physical ones, no other constraints are imposed on planning the mobility of these nodes. In this paper, we address a more general version of the problem. Specifically, we consider a setting in which the mobility of each node is externally constrained by a schedule consisting of a list of locations that the node must visit at particular times. Typically, such schedules exhibit some level of slack, which could be leveraged to achieve a specific coverage distribution of a field. Such a distribution defines the relative importance of different field locations. We define the Constrained Mobility Coordination problem for Preferential Coverage (CMC-PC) as follows: given a field with a desired monitoring distribution, and a number of nodes n, each with its own schedule, we need to coordinate the mobility of the nodes in order to achieve the following two goals: 1) satisfy the schedules of all nodes, and 2) attain the required coverage of the given field. We show that the CMC-PC problem is NP-complete (by reduction from the Hamiltonian Cycle problem). We then propose TFM, a distributed heuristic to achieve field coverage that is as close as possible to the required coverage distribution. We verify the premise of TFM using extensive simulations, as well as taxi logs from a major metropolitan area. We compare TFM to the random mobility strategy, which provides a lower bound on performance. Our results show that TFM is very successful in matching the required field coverage distribution, and that it provides at least a two-fold improvement in query success ratio for queries that follow the target coverage distribution of the field.

Relevance: 10.00%

Publisher:

Abstract:

Personal communication devices are increasingly equipped with sensors that are able to collect and locally store information from their environs. The mobility of users carrying such devices, and hence the mobility of sensor readings in space and time, opens new horizons for interesting applications. In particular, we envision a system in which the collective sensing, storage and communication resources, and mobility of these devices could be leveraged to query the state of (possibly remote) neighborhoods. Such queries would have spatio-temporal constraints which must be met for the query answers to be useful. Using a simplified mobility model, we analytically quantify the benefits from cooperation (in terms of the system's ability to satisfy spatio-temporal constraints), which we show to go beyond simple space-time tradeoffs. In managing the limited storage resources of such cooperative systems, the goal should be to minimize the number of unsatisfiable spatio-temporal constraints. We show that Data Centric Storage (DCS), or "directed placement", is a viable approach for achieving this goal, but only when the underlying network is well connected. Alternatively, we propose "amorphous placement", in which sensory samples are cached locally, and shuffling of cached samples is used to diffuse the sensory data throughout the whole network. We evaluate conditions under which directed versus amorphous placement strategies would be more efficient. These results lead us to propose a hybrid placement strategy, in which the spatio-temporal constraints associated with a sensory data type determine the most appropriate placement strategy for that data type. We perform an extensive simulation study to evaluate the performance of directed, amorphous, and hybrid placement protocols when applied to queries that are subject to timing constraints. Our results show that directed placement is better for queries with moderately tight deadlines, whereas amorphous placement is better for queries with looser deadlines, and that under most operational conditions the hybrid technique gives the best compromise.

Relevance: 10.00%

Publisher:

Abstract:

This thesis covers both the packaging of silicon photonic devices with fiber inputs and outputs and the integration of laser light sources with these same devices. The principal challenge in both of these pursuits is coupling light into the submicrometer waveguides that are the hallmark of silicon-on-insulator (SOI) systems. Previous work on grating couplers is leveraged to design new approaches to bridge the gap between the highly integrated domain of silicon, the interconnected world of fiber, and the active region of III-V materials. First, a novel process for the planar packaging of grating couplers with fibers is explored in detail. This technology allows the creation of easy-to-use test platforms for laser integration and also stands on its own merits as an enabling technology for next-generation silicon photonics systems. The alignment tolerances of this process are shown to be well suited to a passive alignment process and to wafer-scale assembly. Furthermore, this technology has already been used to package demonstrators for research partners and is included in the offerings of the ePIXfab silicon photonics foundry and as a design kit for PhoeniX Software's MaskEngineer product. After this, a process for hybridly integrating a discrete edge-emitting laser with a silicon photonic circuit using near-vertical coupling is developed and characterized. The details of the various steps of the design process are given, including mechanical, thermal, optical and electrical steps, and the interrelation of these design domains is discussed. The construction process for a demonstrator is outlined, and measurements are presented of a series of single-wavelength Fabry-Pérot lasers along with a two-section laser tunable in the telecommunications C-band. The suitability and potential of this technology for mass manufacture are demonstrated, with further opportunities for improvement detailed and discussed in the conclusion.

Relevance: 10.00%

Publisher:

Abstract:

Organizations that leverage lessons learned from their experience in the practice of complex real-world activities face five difficult problems. First, how to represent the learning situation in a recognizable way. Second, how to represent what was actually done in terms of repeatable actions. Third, how to assess performance, taking account of the particular circumstances. Fourth, how to abstract lessons learned that are re-usable on future occasions. Fifth, how to determine whether to pursue practice maturity or strategic relevance of activities. Here, organizational learning and performance improvement are investigated in a field study using the Context-based Intelligent Assistant Support (CIAS) approach. A new conceptual framework for practice-based organizational learning and performance improvement is presented that supports researchers and practitioners in addressing these problems and contributes to a practice-based approach to activity management. The novelty of the research lies in the simultaneous study of the different levels involved in the activity. Route selection in light rail infrastructure projects involves practices at both the strategic and operational levels; it is part managerial/political and part engineering. Aspectual comparison of practices represented in Contextual Graphs constitutes a new approach to the selection of Key Performance Indicators (KPIs). This approach is free from causality assumptions and forms the basis of a new approach to practice-based organizational learning and performance improvement. The evolution of practices in contextual graphs is shown to be an objective and measurable expression of organizational learning. This diachronic representation is interpreted using a practice-based organizational learning novelty typology. This dissertation shows how lessons learned, when effectively leveraged by an organization, lead to practice maturity. The practice maturity level of an activity, in combination with an assessment of the activity's strategic relevance, can be used by management to prioritize improvement effort.