928 results for multi-way analysis
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
Dissertation presented to obtain the Ph.D. degree in Biochemistry, Structural Biochemistry
Abstract:
An energy harvesting system requires an energy storage device to store the energy retrieved from the surrounding environment. This can be either a rechargeable battery or a supercapacitor. Due to their limited lifetime, rechargeable batteries need to be periodically replaced; a supercapacitor, which ideally has an unlimited number of charge/discharge cycles, can therefore be used instead. However, a voltage regulator is required to maintain a constant output voltage as the supercapacitor discharges. This can be implemented by a switched-capacitor (SC) DC-DC converter, which allows complete integration in CMOS technology, although several topologies are required to obtain high efficiency. This thesis presents a complete analysis of four different topologies in order to derive expressions that allow the designer to determine the optimum input voltage range of each topology. To better understand the parasitic effects, the implementation of the capacitors and the non-ideal behaviour of the switches, in 130 nm technology, were carefully studied. Building on these two analyses, a multi-ratio SC DC-DC converter was designed with an output power of 2 mW, a maximum efficiency of 77%, and a maximum steady-state output ripple of 23 mV, for an input voltage swing from 2.3 V down to 0.85 V. The proposed converter has four operation states that implement the conversion ratios 1/2, 2/3, 1/1 and 3/2, and its clock frequency is automatically adjusted to produce a stable output voltage of 1 V. These features are implemented through two distinct controller circuits that use asynchronous state machines (ASM) to dynamically adjust the clock frequency and to select the active state of the converter. All the theoretical expressions, as well as the behaviour of the whole system, were verified by electrical simulations.
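As a minimal sketch of the state-selection idea (not the thesis' controller: the 0.05 V margin and the ideal-efficiency model below are simplifying assumptions), the converter can pick, for each input voltage, the smallest of its four conversion ratios that still reaches the 1 V target, since an ideal SC converter delivers r * Vin at an intrinsic efficiency of roughly Vout / (r * Vin):

```python
# Minimal sketch: ratio selection for a multi-ratio SC DC-DC converter.
# The margin and the ideal-efficiency model are illustrative assumptions.
RATIOS = (1/2, 2/3, 1/1, 3/2)   # the four conversion ratios of the converter
V_OUT = 1.0                     # regulated output voltage (V)

def select_ratio(v_in: float, margin: float = 0.05) -> float:
    """Smallest ratio whose ideal output still exceeds the target.

    An ideal SC converter delivers r * v_in with intrinsic efficiency of
    about V_OUT / (r * v_in), so the smallest feasible ratio is also the
    most efficient one.
    """
    for r in sorted(RATIOS):
        if r * v_in >= V_OUT + margin:
            return r
    return max(RATIOS)          # deeply discharged input: step up by 3/2

for v in (2.3, 1.8, 1.2, 0.85):
    r = select_ratio(v)
    eff = min(1.0, V_OUT / (r * v))
    print(f"Vin = {v:.2f} V -> ratio {r:.2f}, ideal efficiency {eff:.0%}")
```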
Abstract:
Information systems are widespread and used by anyone with a computing device, as well as by corporations and governments. Security leaks are often introduced during the development of an application. The reasons for these security bugs are multiple, but among them is the fact that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs: namely, programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether programs protect the confidentiality of the information they manipulate. We also implemented a prototype typechecker, available at http://ctp.di.fct.unl.pt/DIFTprototype/.
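The key idea, that a field's security level can depend on another field's runtime value, can be illustrated with a small runtime analogue (a hedged sketch only: the thesis develops a static type system, and the record, levels and names below are invented for illustration):

```python
# Runtime analogue of a value-dependent security label. The real work is a
# static type system; this only illustrates that the confidentiality level
# of one field can depend on the runtime value of another.
from dataclasses import dataclass

LEVELS = {"public": 0, "confidential": 1}

@dataclass
class Record:
    owner: str
    shared: bool
    body: str

    @property
    def body_level(self) -> str:
        # The label of `body` depends on the runtime value of `shared`.
        return "public" if self.shared else "confidential"

def flows_to(source_level: str, sink_level: str) -> bool:
    """Information may only flow upward in the lattice (no leaks)."""
    return LEVELS[source_level] <= LEVELS[sink_level]

def publish(record: Record, channel_level: str = "public") -> str:
    if not flows_to(record.body_level, channel_level):
        raise PermissionError("information-flow violation: confidential body")
    return record.body

print(publish(Record("alice", shared=True, body="memo")))   # allowed
# publish(Record("bob", shared=False, body="secret"))        # would raise
```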
Abstract:
We report on the growth and the structural and morphological characterization of stacked layers of self-assembled GeSn dots grown on Si (100) substrates by molecular beam epitaxy at a low substrate temperature T = 350 °C. Samples consist of 1 to 10 layers of Ge0.96Sn0.04 self-assembled dots separated by 10 nm thick Si spacer layers. Their structural analysis was performed by transmission electron microscopy, atomic force microscopy and Raman scattering. We found that up to 4 stacks of dots could be grown with good dot-layer homogeneity, making the GeSn dots interesting candidates for optoelectronic device applications.
Abstract:
Doctoral Thesis in Political Science and International Relations
Abstract:
Magdeburg, Univ., Faculty of Economics, Dissertation, 2013
Abstract:
Magdeburg, Univ., Faculty of Mechanical Engineering, Dissertation, 2014
Abstract:
Otto-von-Guericke-Universität Magdeburg, Faculty of Economics and Management, Univ., Dissertation, 2015
Abstract:
The objective of this study, which is part of a research project on functional loss and mortality in frail elderly people, is to build a predictive survival process that takes into account the functional and nutritional evolution of patients over time. The study involves the analysis of survival data together with repeated measures, but the usual statistical methods for the joint treatment of this kind of data are not appropriate in this case. As an alternative, we use multi-state survival models to evaluate the association between mortality and the recovery, or not, of functional and nutritional levels considered normal. After estimating the model and identifying the prognostic factors for mortality, it is possible to obtain a predictive process that yields survival predictions for patients as a function of their particular history up to a given moment. This allows a more precise prognosis for each group of patients, which can be very useful to health professionals when making clinical decisions.
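To make the multi-state idea concrete, a toy discrete-time Markov sketch follows; the states and transition probabilities are invented for illustration, whereas the study estimates such quantities from patient data:

```python
# Toy discrete-time Markov multi-state model: survival prediction depends on
# whether the patient currently occupies the "recovered" state.
import numpy as np

STATES = ["impaired", "recovered", "dead"]  # "dead" is absorbing
P = np.array([
    [0.70, 0.20, 0.10],   # from impaired
    [0.15, 0.80, 0.05],   # from recovered
    [0.00, 0.00, 1.00],   # from dead
])

def survival_curve(start_state: str, horizon: int) -> list[float]:
    """P(alive at time t) given the state occupied at t = 0."""
    dist = np.eye(3)[STATES.index(start_state)]
    curve = []
    for _ in range(horizon):
        dist = dist @ P
        curve.append(1.0 - dist[STATES.index("dead")])
    return curve

# A patient who has recovered normal levels has a better predicted survival:
print([round(p, 2) for p in survival_curve("impaired", 5)])
print([round(p, 2) for p in survival_curve("recovered", 5)])
```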
Abstract:
This paper presents an outline of the rationale and theory of the MuSIASEM scheme (Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism). First, three points of the rationale behind the MuSIASEM scheme are discussed: (i) endosomatic and exosomatic metabolism in relation to Georgescu-Roegen's flow-fund scheme; (ii) the bioeconomic analogy of hypercycle and dissipative parts in ecosystems; (iii) the dramatic reallocation of human time and land use patterns across the sectors of a modern economy. Next, a flow-fund representation of the MuSIASEM scheme on three levels (the whole national level, the paid-work sectors level, and the agricultural sector level) is illustrated to look at the structure of the human economy in relation to two primary factors: (i) human time, a fund; and (ii) exosomatic energy, a flow. The three-level representation uses extensive and intensive variables simultaneously. Key conceptual tools of the MuSIASEM scheme, mosaic effects and impredicative loop analysis, are explained using the three-level flow-fund representation. Finally, we claim that the MuSIASEM scheme can be seen as a multi-purpose grammar useful for dealing with sustainability issues.
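As an illustration of the flow-fund bookkeeping behind the mosaic effect (all numbers invented), the intensive variable exosomatic metabolic rate, EMR = exosomatic energy flow / human time fund, can be checked for consistency across levels:

```python
# Flow-fund bookkeeping sketch: a fund (human time, h/yr) and a flow
# (exosomatic energy, MJ/yr) per sector; figures are invented.
sectors = {
    # sector:     (human activity h/yr, exosomatic energy MJ/yr)
    "paid_work":  (1.0e9, 9.0e9),
    "households": (7.0e9, 3.0e9),
}

total_time = sum(t for t, _ in sectors.values())
total_energy = sum(e for _, e in sectors.values())

for name, (t, e) in sectors.items():
    print(f"EMR {name}: {e / t:.2f} MJ/h")        # intensive, level n-1
print(f"EMR whole society: {total_energy / total_time:.2f} MJ/h")  # level n

# Mosaic effect: the whole-level flow is recovered from part-level
# intensive variables weighted by their share of the fund.
assert abs(total_energy - sum(t * (e / t) for t, e in sectors.values())) < 1e-6
```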
Abstract:
Introduction: Coordination is a strategy chosen by the central nervous system to control movements and maintain stability during gait. Coordinated multi-joint movements require a complex interaction between nervous outputs, biomechanical constraints, and proprioception. Quantitatively understanding and modeling gait coordination remains a challenge. Surgeons lack a way to model and assess the coordination of patients before and after surgery of the lower limbs. Patients alter their gait patterns and their kinematic synergies when they walk faster or slower than normal speed in order to maintain stability and minimize the energy cost of locomotion. The goal of this study was to provide a dynamical-system approach that quantitatively describes human gait coordination, and to apply it to patients before and after total knee arthroplasty. Methods: A new method for the quantitative analysis of inter-joint coordination during gait was designed, providing a general model that captures the whole dynamics and exposes the kinematic synergies at various walking speeds. The proposed model imposes a relationship among the lower-limb joint angles (hips and knees) to parameterize the dynamics of locomotion of each individual. An integration of different analysis tools, namely harmonic analysis, principal component analysis, and artificial neural networks, helped overcome the high dimensionality, temporal dependence, and non-linear relationships of the gait patterns. Ten subjects were studied using an ambulatory gait device (Physilog®). Each participant was asked to perform two 30 m walking trials at 3 different speeds and to complete an EQ-5D questionnaire, a WOMAC and a Knee Society Score. Lower-limb rotations were measured by four miniature angular rate sensors mounted, respectively, on each shank and thigh. The outcomes of the eight patients undergoing total knee arthroplasty, recorded pre-operatively and post-operatively at 6 weeks, 3 months, 6 months and 1 year, were compared to those of 2 age-matched healthy subjects. Results: The new method provided coordination scores at various walking speeds, ranging from 0 to 10. It determined the overall coordination of the lower limbs as well as the contribution of each joint to the total coordination. The differences between pre-operative and post-operative coordination values correlated with the improvements in the subjective outcome scores. Although the study group was small, the results showed a new way to objectively quantify the gait coordination of patients undergoing total knee arthroplasty, using only portable body-fixed sensors. Conclusion: A new method for objective gait coordination analysis has been developed, with very encouraging results regarding the objective outcome of lower-limb surgery.
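A hedged sketch of the feature-extraction pipeline named in the Methods, harmonic (Fourier) features of joint-angle cycles reduced by PCA, is shown below; the synthetic signals and the mapping from explained variance to a 0-10 score are assumptions for illustration, not the thesis' definitions:

```python
# Pipeline sketch: Fourier features of four joint-angle signals per stride
# (left/right hip and knee), reduced by PCA across strides.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100, endpoint=False)      # one normalized gait cycle

def joint_angle(phase: float, noise: float = 0.05) -> np.ndarray:
    """Synthetic joint angle: fundamental + first harmonic of the cycle."""
    return (np.sin(2 * np.pi * (t + phase))
            + 0.3 * np.sin(4 * np.pi * (t + phase))
            + noise * rng.standard_normal(t.size))

# 30 strides, each described by 5 harmonic magnitudes per joint signal
strides = np.array([
    np.concatenate([np.abs(np.fft.rfft(joint_angle(p)))[:5]
                    for p in (0.0, 0.5, 0.05, 0.55)])
    for _ in range(30)
])

pca = PCA(n_components=2).fit(strides)
explained = pca.explained_variance_ratio_.sum()  # tight synergy -> high value
print(f"coordination score (toy): {10 * explained:.1f} / 10")
```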
Abstract:
In this paper we present an empirical application of the multi-region input-output (MRIO) method to enumerate the pollution content of interregional trade flows between five US Mid-West regions/states – Illinois, Indiana, Iowa, Michigan and Wisconsin – and the rest of the US. This allows us to analyse some very important issues concerning the nature and significance of interregional environmental spillovers within the US Mid-West and the existence of pollution 'trade balances' between states. Our results raise questions about the extent to which state-level authorities can control local emissions, since some emissions are driven by changes in demand elsewhere in the Mid-West and the US. This implies a need for policy co-ordination between national and state-level authorities in the US to meet emissions reduction targets. The existence of environmental trade balances between states also raises the issue of net losses/gains of pollutants as a result of interregional trade within the US, and of whether, if certain activities can be carried out with less polluting technology in one region relative to others, it is better for the US as a whole if this type of relationship exists.
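The core MRIO accounting step can be sketched with a two-region toy example (all coefficients invented): gross output follows from the Leontief inverse, x = (I - A)^-1 f, and pushing direct emission intensities through that inverse attributes emissions to each region's final demand:

```python
# Two-region, one-sector-per-region MRIO sketch with invented coefficients.
import numpy as np

A = np.array([[0.20, 0.05],    # A[i, j]: input from region i per unit output of j
              [0.10, 0.15]])
f = np.array([100.0, 80.0])    # final demand by region
e = np.array([0.8, 0.3])       # direct emissions per unit output (region 2 cleaner)

x = np.linalg.solve(np.eye(2) - A, f)   # gross output: x = (I - A)^{-1} f
print("gross output:", x.round(1))

# Consumption-based accounting: emission multipliers (e through the Leontief
# inverse) times each region's final demand give its embodied emissions.
L = np.linalg.inv(np.eye(2) - A)
embodied = (e @ L) * f
print("emissions embodied in each region's demand:", embodied.round(1))
print("total production-based emissions:", (e @ x).round(1))  # same total
```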
Abstract:
Resource management in multi-core processors has gained importance with the evolution of applications and architectures, but this management is very complex. For example, the same parallel application executed multiple times with the same input data, on a single multi-core node, can show highly variable execution times. Multiple hardware and software factors affect performance. The way hardware resources (compute and memory) are assigned to processes or threads, possibly belonging to several competing applications, is fundamental in determining this performance. The gap between assigning resources without knowing the real needs of the application and assigning them with a specific goal keeps growing. The best way to perform this assignment is automatically, with minimal programmer intervention. It is worth noting that the way an application runs on an architecture is not necessarily the most suitable one, and this situation can be improved through proper management of the available resources. Appropriate resource management can benefit both the application developer and the computing environment where the application runs, allowing a larger number of applications to execute with the same amount of resources. Moreover, this resource management would not require changes to the application or to its operating strategy. In order to propose resource-management policies, we analysed the behaviour of compute-intensive and memory-intensive applications. This analysis was carried out by studying core-placement parameters, the need for shared memory, the size of the input workload, the distribution of data within the processor, and the granularity of work. Our goal is to identify how these parameters influence execution efficiency, to identify bottlenecks, and to propose possible improvements. A further proposal is to adapt the strategies already used by the scheduler in order to obtain better results.
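As a hedged sketch of one experiment this kind of analysis suggests, core placement, the snippet below pins the process to a chosen core set (a Linux-only call) and times a memory-bound kernel under two illustrative placements; the kernel and core sets are assumptions, not the thesis' benchmarks:

```python
# Measure the effect of core placement on a memory-bound kernel.
# os.sched_setaffinity is Linux-specific.
import os
import time
import numpy as np

def timed_kernel(n: int = 2_000_000, reps: int = 5) -> float:
    """Streaming, memory-bound array update; returns elapsed seconds."""
    a = np.arange(n, dtype=np.float64)
    t0 = time.perf_counter()
    for _ in range(reps):
        a = a * 1.000001 + 1.0
    return time.perf_counter() - t0

for cores in ({0}, {0, 1}):          # two illustrative placements
    os.sched_setaffinity(0, cores)   # restrict this process to these cores
    print(f"cores {sorted(cores)}: {timed_kernel():.3f} s")
```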