840 results for Problems and potentials


Relevance:

90.00%

Publisher:

Abstract:

Multilevel converters have been under research and development for more than three decades and have found successful industrial application. However, this is still a technology under development, and many new contributions and new commercial topologies have been reported in the last few years. The aim of this paper is to group and review these recent contributions in order to establish the current state of the art and trends of the technology, and to provide readers with a comprehensive and insightful review of where multilevel converter technology stands and is heading. The paper first presents a brief overview of well-established multilevel converters, strongly oriented to their current state in industrial applications, and then centers the discussion on the new converters that have made their way into industry. In addition, new promising topologies are discussed. Recent advances made in the modulation and control of multilevel converters are also addressed. A great part of this paper is devoted to showing nontraditional applications powered by multilevel converters and how multilevel converters are becoming an enabling technology in many industrial sectors. Finally, some future trends and challenges in the further development of this technology are discussed to motivate future contributions that address open problems and explore new possibilities.

Relevance:

90.00%

Publisher:

Abstract:

Usher syndrome (USH) is an inherited blindness and deafness disorder with variable vestibular dysfunction. The syndrome is divided into three subtypes according to the progression and severity of clinical symptoms. The gene mutated in Usher syndrome type 3 (USH3), clarin 1 (CLRN1), was identified in Finland in 2001, and two mutations were identified in Finnish patients at that time. Prior to this thesis study, these two CLRN1 mutations were the only USH mutations identified in Finnish USH patients. To further clarify the Finnish USH mutation spectrum, all nine USH genes were studied. Seven mutations were identified: one was a previously known mutation in CLRN1, four were novel mutations in myosin VIIa (MYO7A), and two were a novel and a previously known mutation in usherin (USH2A). Another aim of this thesis research was to further study the structure and function of the CLRN1 gene and to clarify the effects of mutations on protein function. The search for new splice variants resulted in the identification of eight novel splice variants in addition to the three splice variants already known prior to this study. Studies of the possible promoter regions for these splice variants showed that the most active region comprised the 1000 bases upstream of the translation start site in the first exon of the main three-exon splice variant. The 232 aa CLRN1 protein encoded by the main (three-exon) splice variant was transported to the plasma membrane when expressed in cultured cells. Western blot studies suggested that CLRN1 forms dimers and multimers. The CLRN1 mutant proteins studied were retained in the endoplasmic reticulum (ER), and some of the USH3 mutations caused CLRN1 to be unstable. During this study, two novel CLRN1 sequence alterations were identified and their pathogenicity was studied by protein expression in cell culture. Previous studies with mice had shown that Clrn1 is expressed in mouse cochlear hair cells and spiral ganglion cells, but the expression profile in the mouse retina remained unknown. Clrn1 knockout mice display cochlear cell disruption/death but do not have a retinal phenotype. In the zebrafish, Danio rerio, clrn1 was found to be expressed in hair cells associated with hearing and balance. Clrn1 expression was also found in the inner nuclear layer (INL), photoreceptor layer and retinal pigment epithelium (RPE) layer of the zebrafish retina. When Clrn1 production was knocked down with injected morpholino oligonucleotides (MO) targeting Clrn1 translation or correct splicing, the zebrafish larvae showed symptoms similar to those of USH3 patients: they had balance/hearing problems and a reduced response to visual stimuli. The knowledge this thesis research has provided about the mutations in USH genes and the Finnish USH mutation spectrum is important for USH patient diagnostics. The extended information about the structure and function of CLRN1 is a step further in exploring the USH3 pathogenesis caused by mutated CLRN1, as well as a step towards finding a cure for the disease.

Relevance:

90.00%

Publisher:

Abstract:

Traumatic brain injury (TBI) affects people of all ages and is a cause of long-term disability. In recent years, the epidemiological patterns of TBI have been changing. TBI is a heterogeneous disorder with different forms of presentation and a highly individual outcome regarding functioning and health-related quality of life (HRQoL). The meaning of disability differs from person to person based on the individual's personality, value system, past experience, and the purpose he or she sees in life. Understanding of all these viewpoints is needed in comprehensive rehabilitation. This study examines the epidemiology of TBI in Finland as well as functioning and HRQoL after TBI, and compares subjective and objective assessments of outcome. The frame of reference is the International Classification of Functioning, Disability and Health (ICF). The subjects of Study I represent the population of Finnish TBI patients who experienced their first TBI between 1991 and 2005. The 55 Finnish subjects of Studies II and IV participated in the first wave of the international Quality of Life after Brain Injury (QOLIBRI) validation study. The 795 subjects from six language areas of Study III formed the second wave of the QOLIBRI validation study. The average annual incidence of hospitalised TBI patients in Finland during the years 1991-2005 was 101 per 100,000 among patients who had TBI as the primary diagnosis and no previous TBI in their medical history. Males (59.2%) were at considerably higher risk of sustaining a TBI than females. The most common external cause of injury was falls in all age groups. The number of TBI patients ≥ 70 years of age increased by 59.4%, while the number of inhabitants older than 70 years in the population of Finland increased by 30.3% during the same time period. The functioning of a sample of 55 persons with TBI was assessed by extracting information from the patients' medical documents using the ICF checklist. The most common problems were found in the ICF components of Body Functions (b) and Activities and Participation (d). HRQoL was assessed with the QOLIBRI, which showed the highest level of satisfaction on the Emotions, Physical Problems, and Daily Life and Autonomy scales. The highest scores were obtained by the youngest participants, by participants living independently without the help of other people, and by people who were working. The relationship between functional outcome and HRQoL was not straightforward. The procedure of linking the QOLIBRI and the GOSE to the ICF showed that these two outcome measures cover the relevant domains of TBI patients' functioning. The QOLIBRI provides the patients' subjective view, while the GOSE summarises the objective elements of functioning. Our study indicates that there are certain domains of functioning that are not traditionally documented sufficiently but are important for the HRQoL of persons with TBI; this was especially the case in the domains of interpersonal relationships, social and leisure activities, the self, and the environment. Rehabilitation aims to optimize functioning and to minimize the experience of disability among people with health conditions, and it needs to be based on a comprehensive understanding of human functioning. As an integrative model, the ICF may serve as a frame of reference in achieving such an understanding.

Relevance:

90.00%

Publisher:

Abstract:

The dissertation examines the role of the EU courts in new governance. New governance has attracted unprecedented interest in the EU in recent years, manifested in a plethora of instruments and actors at various levels that challenge more traditional forms of command-and-control regulation. New governance and, more generally, political experimentation are thought to sap the ability of the EU judiciary to monitor and review these experiments. The exclusion of the courts is then seen to add to the legitimacy problem of new governance. The starting point of this dissertation is the observation that the marginalised role of the courts rests on theoretical and empirical assumptions which invite scrutiny. The theoretical framework of the dissertation is deliberative democracy and democratic experimentalism. The analysis of deliberative democracy is sustained by an attempt to apply theoretical concepts to three distinctive examples of governance in the EU: the EU Sustainable Development Strategy, the European Chemicals Agency, and the Common Implementation Strategy for the Water Framework Directive. The case studies reveal numerous disincentives and barriers to judicial review, among them questions of the role of courts in shaping governance frameworks, the reviewability of science-based measures, the standing of individuals before the courts, and the justiciability of soft law. The dissertation analyses the conditions of judicial review in each governance environment and proposes improvements. From a more theoretical standpoint, each case study presents a governance regime that builds on legislation laying out major (guide)lines but leaving details to be filled in at a later stage. Specification of detailed standards takes place through collaborative networks comprising members from national administrations, NGOs, and the Commission. Viewed this way, deliberative problem-solving is needed to bring people together to clarify, elaborate, and revise largely abstract and general norms in order to resolve concrete and specific problems and to make law applicable and enforceable. The dissertation draws attention to the potential of the peer review embedded in these arrangements and its profound consequences for judicial accountability structures. It is argued that without this kind of ongoing and dynamic peer review of accountability within governance frameworks, judicial review of new governance is difficult and in some cases impossible. This claim has implications for how we understand the concept of soft law, the role of the courts, participation rights, and the legitimacy of governance measures more generally. The experimentalist architecture of judicial decision-making relies upon a wide variety of actors to provide the conditions for legitimate and efficient review.

Relevance:

90.00%

Publisher:

Abstract:

Since the 1970s, alcohol and drug use by pregnant women has become a target of political, professional and personal concern. The present study focuses on prenatal substance use and the regulation of risks by examining different kinds of societal responses to prenatal alcohol and drug use. The study analyses face-to-face encounters between professionals and service users at a specialised maternity clinic for pregnant women with substance abuse problems, medical and political discourses on the compulsory treatment of pregnant women as a means of FAS prevention, and official recommendations on alcohol intake during pregnancy. Moreover, the study addresses the women's perspective by asking how women who have used illicit drugs during pregnancy perceive and rank the dangers linked to drug use. The study consists of five empirical sub-studies and a summary article. Sub-study I was written in collaboration with Dorte Hecksher and Sub-study IV with Riikka Perälä. Theoretically, the study builds, on the one hand, on the socio-cultural approach to the selection and perception of risks and, on the other, on governmentality studies, which focus on the use of power in contemporary Western societies. The study is based on an ethnographic approach and makes use of the principles of multi-sited ethnography. The empirical sub-studies are based on three different types of qualitative data: 1) ethnographic field notes from a maternity clinic covering a period of seven months, 2) documentary material (medical journals, political documents, health education materials, government reports) and 3) interviews with clients and members of staff at maternity clinics. The study demonstrates that the logic of the regulation of prenatal alcohol use in Finland is characterised by the "rise of the foetus", a process in which the urgency of protecting the foetus has gradually gained a more prominent role in the discourses on alcohol-related foetal damage. An increasing unwillingness to accept any kind of risk when foetal health is at stake is manifested in the public debate on the compulsory treatment of pregnant women with alcohol problems and in the health authorities' decision to advise pregnant women to refrain from alcohol use during pregnancy (Sub-studies I and II). Secondly, the study suggests that maternity care professionals have an ambivalent role in their mundane encounters with their pregnant clients: on the one hand, professionals focus on the well-being of the foetus, but on the other, they need to take into account the women's needs and agency. The professionals' daily encounters with their clients are thus characterised by hybridisation: the simultaneous use of technologies of domination and technologies of agency (Sub-studies III and IV). Finally, the study draws attention to the women's understanding of the risks of illicit drug use during pregnancy and shows that their understanding of risk differs from the bio-medical view. The study suggests that when drug-using pregnant women seek professional help, they can feel that their moral worth is threatened by professionals' negative attitudes, which can make service use challenging.

Relevance:

90.00%

Publisher:

Abstract:

Aquatic ecosystems are dynamic and depend on various interdependent and inter-related factors that are vital for their existence and for maintaining the ecological balance. Various anthropogenic activities have impaired ecological conditions in many ecosystems. This monograph gives an account of the essentials of limnology, which helps in understanding the nature and extent of the problems, and also provides an insight into the use of Geographic Information Systems as an effective tool for resource inventorying, monitoring and management. The monograph consists of four chapters, the first of which gives an overall view of inland aquatic bodies as complex ecological systems. It begins with the formation of lakes and the various physical, chemical and biological factors that determine these ecosystems. The physical factors covered include morphometry, density, light, etc., and the lake chemistry determined by various anions and cations is discussed in detail. The biological parameters include the phytoplankton, zooplankton, waterfowl and fish communities that play an important role in freshwater biodiversity, and are presented with diagrams for easy understanding. The monograph gives an in-depth view of lake zones, productivity, and seasonal changes in the lake community, with various energy relationships. The concept of the food chain and food web in an aquatic ecosystem is also presented with illustrations. Lastly, the various anthropogenic activities that have deteriorated water quality are listed along with restoration strategies.

Relevance:

90.00%

Publisher:

Abstract:

The questions that one should answer in engineering computations - deterministic, probabilistic/randomized, as well as heuristic - are (i) how good the computed results/outputs are and (ii) how much the cost is, in terms of the amount of computation and the amount of storage used to obtain the outputs. The absolutely error-free quantities, as well as the completely errorless computations done in a natural process, can never be captured by any means that we have at our disposal. While the computations in nature/natural processes, including their input real quantities, are exact, the computations that we do using a digital computer, or that are carried out in an embedded form, are never exact. The input data for such computations are also never exact, because any measuring instrument has an inherent error of a fixed order associated with it, and this error, as a matter of hypothesis and not as a matter of assumption, is not less than 0.005 per cent. Here by error we imply relative error bounds. The fact that the exact error is never known under any circumstances or in any context implies that the term "error" really means error-bounds. Further, in engineering computations, it is the relative error or, equivalently, the relative error-bounds (and not the absolute error) that is supremely important in providing us with information regarding the quality of the results/outputs. Another important fact is that inconsistency and/or near-inconsistency in nature, i.e., in problems created from nature, is completely nonexistent, while in our modelling of natural problems we may introduce inconsistency or near-inconsistency due to human error, due to the inherent non-removable error associated with any measuring device, or due to assumptions introduced to make the problem solvable or more easily solvable in practice. Thus, if we discover any inconsistency, or possibly any near-inconsistency, in a mathematical model, it is certainly due to any or all of these three factors. We do, however, go ahead and solve such inconsistent/near-inconsistent problems and obtain results that can be useful in real-world situations. The talk considers several deterministic, probabilistic, and heuristic algorithms in numerical optimisation, other numerical and statistical computations, and in PAC (probably approximately correct) learning models. It highlights the quality of the results/outputs by specifying relative error-bounds along with the associated confidence level, and the cost, viz., the amount of computation and of storage, through complexity. It points out the limitations of error-free computation (wherever possible, i.e., where the number of arithmetic operations is finite and known a priori) as well as of the use of interval arithmetic. Further, the interdependence among the error, the confidence, and the cost is discussed.
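
To make the notion of relative error-bounds concrete, here is a minimal sketch (not part of the talk) of how such bounds propagate through basic arithmetic, assuming each measured input carries the 0.005 per cent instrument bound mentioned above; the function names and sample numbers are illustrative only.

    # Minimal illustration: propagating relative error-bounds through
    # arithmetic, assuming every measured input carries the 0.005 per cent
    # relative error-bound discussed in the abstract.

    def mul_bound(rb_x, rb_y):
        # For z = x*y the relative error-bound combines (worst case) as
        # rb_z = rb_x + rb_y + rb_x*rb_y, i.e. roughly the sum of the bounds.
        return rb_x + rb_y + rb_x * rb_y

    def add_bound(x, rb_x, y, rb_y):
        # For z = x + y with same-sign operands the relative bound is a
        # weighted mean, so it never exceeds max(rb_x, rb_y); cancellation
        # between opposite signs is what inflates the relative bound.
        return (abs(x) * rb_x + abs(y) * rb_y) / abs(x + y)

    rb = 5e-5                           # 0.005 per cent instrument bound
    print(f"bound on a product of two measurements: {mul_bound(rb, rb):.2e}")
    print(f"bound on the sum 10 + 5 of two measurements: {add_bound(10, rb, 5, rb):.2e}")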

Relevance:

90.00%

Publisher:

Abstract:

In today's era of energy crisis and global warming, hydrogen has been projected as a sustainable alternative to depleting, CO2-emitting fossil fuels. However, its deployment as an energy source is impeded by many issues, one of the most important being storage. Chemical hydrogen storage materials, in particular B-N compounds such as ammonia borane, with a potential storage capacity of 19.6 wt% H2 and 0.145 kg H2 L-1, have been intensively studied from the standpoint of addressing the storage issues. Ammonia borane undergoes dehydrogenation through hydrolysis at room temperature in the presence of a catalyst, but its practical implementation is hindered by several problems affecting all of the chemical compounds in the reaction scheme, including ammonia borane, water, borate byproducts, and hydrogen. In this Minireview, we exhaustively survey the state of the art, discuss the fundamental problems, and, where applicable, propose solutions with the prospect of technological applications.
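
As a quick check of the quoted 19.6 wt% figure, the gravimetric capacity follows from counting all six hydrogen atoms of NH3BH3 (three H2 per formula unit) against the mass of ammonia borane alone; the short calculation below is only that bookkeeping, not a statement about the practical, system-level capacity (hydrolysis also consumes water).

    # Check of the quoted 19.6 wt% figure: all six H atoms of NH3BH3
    # (equivalent to three H2 per formula unit) divided by the molar mass
    # of ammonia borane itself.
    M = {"H": 1.008, "B": 10.811, "N": 14.007}      # g/mol
    m_AB = M["N"] + M["B"] + 6 * M["H"]             # NH3BH3, about 30.87 g/mol
    m_H2 = 3 * 2 * M["H"]                           # three H2, about 6.05 g/mol
    print(f"gravimetric capacity: {100 * m_H2 / m_AB:.1f} wt%")   # ~19.6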

Relevance:

90.00%

Publisher:

Abstract:

Groundwater management problems are typically solved by the simulation-optimization approach, in which complex numerical models are used to simulate groundwater flow and/or contaminant transport. These numerical models take a long time to solve the management problems and hence become computationally expensive. In this study, Artificial Neural Network (ANN) and Particle Swarm Optimization (PSO) models were developed and coupled for the management of groundwater of the Dore river basin in France. An Analytic Element Method (AEM) based flow model was developed and used to generate the dataset for training and testing the ANN model. The developed ANN-PSO model was applied to minimize the pumping cost of the wells, including the cost of the pipeline. The discharge and location of the pumping wells were taken as the decision variables, and the ANN-PSO model was applied to find the optimal location of the wells. The results of the ANN-PSO model were found to be similar to those obtained by the AEM-PSO model. The results show that the ANN model can reduce the computational burden significantly, as it is able to analyze different scenarios, and that the ANN-PSO model is capable of identifying the optimal location of wells efficiently.
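
The coupling described above lends itself to a simple structure: a PSO loop whose objective is evaluated by the trained ANN surrogate rather than by the expensive flow model. The sketch below illustrates only that structure; the cost function, decision-variable bounds and PSO constants are placeholders, not the study's actual model or data.

    # Illustrative sketch only: a bare-bones PSO loop whose objective is a
    # surrogate cost(x) standing in for an ANN trained on AEM flow-model runs.
    import numpy as np

    rng = np.random.default_rng(0)

    def surrogate_cost(x):
        # Stand-in for ann.predict(x): x = (well_x, well_y, discharge).
        well_x, well_y, q = x
        return q * np.hypot(well_x - 2.0, well_y - 3.0) + 0.1 * q**2

    lo, hi = np.array([0.0, 0.0, 1.0]), np.array([10.0, 10.0, 50.0])
    n, d, w, c1, c2 = 30, 3, 0.7, 1.5, 1.5        # swarm size, dim, PSO constants

    x = rng.uniform(lo, hi, (n, d))               # particle positions
    v = np.zeros((n, d))                          # particle velocities
    pbest, pcost = x.copy(), np.array([surrogate_cost(p) for p in x])
    gbest = pbest[pcost.argmin()].copy()

    for _ in range(200):
        r1, r2 = rng.random((n, d)), rng.random((n, d))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        cost = np.array([surrogate_cost(p) for p in x])
        improved = cost < pcost
        pbest[improved], pcost[improved] = x[improved], cost[improved]
        gbest = pbest[pcost.argmin()].copy()

    print("optimal (x, y, discharge):", gbest, "cost:", pcost.min())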

Relevance:

90.00%

Publisher:

Abstract:

A detailed analysis of alternating current impedance data of LiMn2O4 electrodes measured at several temperatures and potentials was carried out. The Nyquist plots generally consisted of semicircles corresponding to two time constants. However, at low temperatures (-10 to 10 °C) and in the potential region between 3.90 and 4.20 V, three time constants were present. The third semicircle, present in the middle-to-high frequency range, was attributed to the electronic resistance of LiMn2O4. Impedance parameters were evaluated using appropriate electrical equivalent circuits. From the temperature dependence of the resistive parameters, activation energy values for the corresponding processes were calculated.
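
The last step mentioned above, extracting activation energies from the temperature dependence of resistive parameters, typically amounts to an Arrhenius fit; a hypothetical numerical sketch is given below, with made-up resistance values that merely stand in for the fitted circuit parameters.

    # Hypothetical Arrhenius fit: 1/R proportional to exp(-Ea / (kB * T)),
    # so the slope of ln(1/R) versus 1/T gives -Ea/kB. The R values below
    # are placeholders, not data from the paper.
    import numpy as np

    kB = 8.617e-5                                   # eV/K
    T = np.array([263.0, 273.0, 283.0, 293.0])      # K
    R = np.array([410.0, 250.0, 160.0, 105.0])      # ohm (placeholder values)

    slope, _ = np.polyfit(1.0 / T, np.log(1.0 / R), 1)
    Ea = -slope * kB
    print(f"activation energy ~ {Ea:.2f} eV")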

Relevance:

90.00%

Publisher:

Abstract:

Structural Support Vector Machines (SSVMs) and Conditional Random Fields (CRFs) are popular discriminative methods used for classifying structured and complex objects like parse trees, image segments and part-of-speech tags. The datasets involved are of very high dimension, and the models designed using typical training algorithms for SSVMs and CRFs are non-sparse. This non-sparseness results in slow inference, so there is a need to devise new algorithms for sparse SSVM and CRF classifier design. The use of the elastic net and the L1-regularizer has already been explored for solving the primal CRF and SSVM problems, respectively, to design sparse classifiers. In this work, we focus on the dual elastic net regularized SSVM and CRF. By exploiting the weakly coupled structure of these convex programming problems, we propose a new sequential alternating proximal (SAP) algorithm to solve these dual problems. The algorithm works by sequentially visiting each training set example and solving a simple subproblem restricted to the small subset of variables associated with that example. Numerical experiments on various benchmark sequence labeling datasets demonstrate that the proposed algorithm scales well. Further, the classifiers designed are sparser than those designed by solving the respective primal problems and demonstrate comparable generalization performance. Thus, the proposed SAP algorithm is a useful alternative for sparse SSVM and CRF classifier design.
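
The per-block proximal idea can be illustrated, in a heavily simplified form, by cyclic block updates with an elastic-net proximal (soft-thresholding) step. The toy below applies this to ordinary elastic-net regression, not to the dual SSVM/CRF objectives of the paper, and is only meant to convey the flavour of sequentially solving one small subproblem at a time.

    # Toy illustration of sequential proximal block updates, shown on plain
    # elastic-net regression (not the paper's dual SSVM/CRF problems).
    import numpy as np

    def soft_threshold(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def elastic_net_cd(X, y, lam1=0.1, lam2=0.1, sweeps=100):
        n, d = X.shape
        w = np.zeros(d)
        col_sq = (X ** 2).sum(axis=0)
        r = y - X @ w                          # residual kept up to date
        for _ in range(sweeps):                # sequentially visit each block
            for j in range(d):                 # here: one coordinate per block
                r += X[:, j] * w[j]            # remove block j's contribution
                w[j] = soft_threshold(X[:, j] @ r, lam1) / (col_sq[j] + lam2)
                r -= X[:, j] * w[j]            # add the updated contribution back
        return w

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 50))
    w_true = np.zeros(50)
    w_true[:5] = 2.0                           # sparse ground truth
    y = X @ w_true + 0.1 * rng.standard_normal(200)
    w = elastic_net_cd(X, y, lam1=5.0, lam2=1.0)
    print("nonzero coefficients recovered:", int((np.abs(w) > 1e-8).sum()))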

Relevance:

90.00%

Publisher:

Abstract:

The trapezoidal rule, which is a special case of the Newmark family of algorithms, is one of the most widely used methods for transient hyperbolic problems. In this work, we show that this rule conserves linear and angular momenta and energy in the case of undamped linear elastodynamics problems, and an "energy-like measure" in the case of undamped acoustic problems. These conservation properties thus provide a rational basis for using this algorithm. In linear elastodynamics problems, variants of the trapezoidal rule that incorporate "high-frequency" dissipation are often used, since the higher frequencies, which are not approximated properly by the standard displacement-based approach, often result in unphysical behavior. Instead of modifying the trapezoidal algorithm, we propose using a hybrid finite element framework for constructing the stiffness matrix. Hybrid finite elements, which are based on a two-field variational formulation involving displacements and stresses, are known to approximate the eigenvalues much more accurately than the standard displacement-based approach, thereby either bypassing or reducing the need for high-frequency dissipation. We show this by means of several examples, in which we compare the numerical solutions obtained using the displacement-based and hybrid approaches against analytical solutions.
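
For reference, the trapezoidal rule is the Newmark scheme with beta = 1/4 and gamma = 1/2; the short sketch below (a toy two-degree-of-freedom system, not the paper's hybrid-element examples) applies it to an undamped linear system M a + K u = 0 and checks the energy-conservation property stated above.

    # Trapezoidal rule (Newmark, beta = 1/4, gamma = 1/2) for M a + K u = 0,
    # with M and K being arbitrary toy matrices.
    import numpy as np

    M = np.diag([2.0, 1.0])
    K = np.array([[ 6.0, -2.0],
                  [-2.0,  4.0]])
    u = np.array([1.0, 0.0])                    # initial displacement
    v = np.zeros(2)                             # initial velocity
    a = np.linalg.solve(M, -K @ u)              # initial acceleration
    dt, beta, gamma = 0.01, 0.25, 0.5

    K_eff = K + M / (beta * dt**2)              # constant for a linear problem

    def energy(u, v):
        return 0.5 * v @ M @ v + 0.5 * u @ K @ u

    E0 = energy(u, v)
    for _ in range(10_000):
        rhs = M @ (u / (beta * dt**2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
        u_new = np.linalg.solve(K_eff, rhs)     # equilibrium at the new step
        a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
        v_new = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        u, v, a = u_new, v_new, a_new

    print(f"relative energy drift after 10000 steps: {abs(energy(u, v) - E0) / E0:.2e}")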

Relevance:

90.00%

Publisher:

Abstract:

The remarkable capability of nature to design and create excellent self-assembled nanostructures, especially in the biological world, has motivated chemists to mimic such systems with synthetic molecular and supramolecular systems. The hierarchically organized self-assembly of low molecular weight gelators (LMWGs) based on non-covalent interactions has proven to be a useful tool in the development of well-defined nanostructures. Among these, the self-assembly of sugar-derived LMWGs has received immense attention because of their propensity to furnish biocompatible, hierarchical, supramolecular architectures that are macroscopically expressed in gel formation. This review sheds light on various aspects of sugar-derived LMWGs, covering their mechanisms of gelation, structural analysis, and tailorable properties, as well as their diverse applications in areas such as stimuli-responsiveness, sensing, self-healing, environmental problems, and nano- and biomaterials synthesis.

Relevance:

90.00%

Publisher:

Abstract:

Biomolecular structure elucidation is one of the major techniques for studying the basic processes of life. These processes are modulated, hindered or altered by various causes such as disease, which is why biomolecular analysis and imaging play an important role in diagnosis, treatment prognosis and monitoring. Vibrational spectroscopy (IR and Raman), which is a molecular-bond-specific technique, can assist the researcher in chemical structure interpretation. In combination with microscopy, vibrational microspectroscopy is currently emerging as an important tool for biomedical research, with spatial resolution at the cellular and sub-cellular level. These techniques offer various advantages, enabling label-free biomolecular fingerprinting in the native state. However, the complexity involved in deciphering the required information from a spectrum has hampered their entry into the clinic. Today, with the advent of automated algorithms, vibrational microspectroscopy excels in the field of spectropathology. However, researchers should be aware of how quantification based on absolute band intensities may be affected by instrumental parameters, sample thickness, water content, substrate backgrounds and other possible artefacts. In this review, these practical issues and their effects on the quantification of biomolecules are discussed in detail. In many cases, ratiometric analysis can help to circumvent these problems and enable the quantitative study of biological samples, including ratiometric imaging in 1D, 2D and 3D. We provide an extensive overview of the recent scientific literature on IR and Raman band ratios used for studying biological systems and for disease diagnosis and treatment prognosis.
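
As a toy illustration of the ratiometric idea, the snippet below builds a synthetic spectrum with two bands and reports the ratio of their integrated areas rather than any absolute intensity; the band positions, widths and the spectrum itself are invented for the example only.

    # Toy band-ratio calculation on a synthetic spectrum: ratios of integrated
    # band areas are far less sensitive to thickness, focus and substrate than
    # absolute intensities. All numbers here are illustrative.
    import numpy as np

    wavenumber = np.linspace(600, 1800, 1200)        # cm^-1 axis
    dx = wavenumber[1] - wavenumber[0]

    def gauss(x, center, width, height):
        return height * np.exp(-0.5 * ((x - center) / width) ** 2)

    rng = np.random.default_rng(0)
    spectrum = (gauss(wavenumber, 1655, 15, 1.0)     # "protein-like" band
                + gauss(wavenumber, 1445, 12, 0.6)   # "lipid-like" band
                + 0.02 * rng.standard_normal(wavenumber.size))

    def band_area(x, y, lo, hi):
        m = (x >= lo) & (x <= hi)
        return y[m].sum() * dx                       # simple rectangle-rule integral

    ratio = (band_area(wavenumber, spectrum, 1620, 1690)
             / band_area(wavenumber, spectrum, 1420, 1470))
    print(f"band-area ratio (1655 / 1445 cm^-1 regions): {ratio:.2f}")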

Relevance:

90.00%

Publisher:

Abstract:

Both earthquake prediction and failure prediction of disordered brittle media are difficult and complicated problems, and they might have something in common. In order to search for clues for earthquake prediction, the common features of failure in a simple nonlinear dynamical model resembling disordered brittle media are examined. It is found that the failure manifests evolution-induced catastrophe (EIC), i.e., an abrupt transition from globally stable (GS) accumulation of damage to catastrophic failure. A distinct feature is the significant uncertainty of the catastrophe, called sample-specificity. Consequently, it is impossible to make a deterministic prediction macroscopically. This is similar to the question of the predictability of earthquakes. However, our model shows that strong stress fluctuations may, statistically, be an immediate precursor of catastrophic failure. This might provide clues for earthquake forecasting.