Abstract:
Pervasive Sensing is a recent research trend that aims at providing widespread computing and sensing capabilities to enable the creation of smart environments that can sense, process, and act by considering input coming from both people and devices. The capabilities necessary for Pervasive Sensing are nowadays available on a plethora of devices, from embedded devices to PCs and smartphones. The wide availability of new devices and the large amount of data they can access enable a wide range of novel services in different areas, spanning from simple data collection systems to socially-aware collaborative filtering. However, the strong heterogeneity and unreliability of devices and sensors pose significant challenges. So far, existing works on Pervasive Sensing have focused only on limited portions of the available stack of devices and the data they can use, proposing and developing mainly vertical solutions. The push from academia and industry for this kind of service shows that the time is ripe for a more general support framework for Pervasive Sensing solutions, one able to reinforce frail architectures, promote a well-balanced use of resources across different devices, and enable the widest possible access to sensed data, while ensuring minimal energy consumption on battery-operated devices. This thesis examines pervasive sensing systems to extract design guidelines as the foundation of a comprehensive reference model for multi-tier Pervasive Sensing applications. The validity of the proposed model is tested in five scenarios with distinct requirements, hardware, and sensors. The ease of mapping from the proposed logical model to the real implementations, together with the positive results of the performance campaigns, demonstrates the quality of the proposed approach and offers a reliable reference model, as well as a direction for the design and deployment of future Pervasive Sensing applications.
Abstract:
With this dissertation research we investigate intersections between design and marketing and, in this respect, the factors that contribute to a product design becoming brand formative. We have developed a Brand Formative Design (BFD) framework, which investigates individual design features in a holistic, comparable, brand-relevant, and consumer-specific context. We discuss what kinds of characteristics contribute to BFD, illuminate how they should be applied, and examine: a holistic framework leading to Brand Formative Design; the identification and assessment of BFD Drivers; the dissection of products into three Distinctive Design Levels; the detection of surprising design preferences; the appropriate degree of scheme deviation with evolutionary design; simulated BFD development processes with three different products and the integration of consumers; future-oriented objectification, comparability, and assessment of design; and recommendations for the management of design in a brand-specific context. Design is a product feature that contributes significantly to the success of products. However, the development of new design poses challenges. Design can hardly be objectified; many people have an opinion concerning the attractiveness of new products but cannot formulate their future preferences. Product design is widely developed on the basis of intuition, which makes the management of design difficult. Here the concept of Brand Formative Design provides a framework that helps to structure, objectify, develop, and assess new evolutionary design in brand-relevant and future-relevant contexts, while also integrating consumers and their preferences without restricting creativity too much.
Abstract:
Morbidity and mortality of myocardial infarction remain significant, with the resulting left ventricular function being a major determinant of clinical outcome. Protecting the myocardium against ischemia-reperfusion injury has become a major therapeutic goal, and the identification of key signaling pathways has paved the way for various interventions, but until now with disappointing results. This article describes the recently discovered role of G-protein-coupled receptor kinase-2 (GRK2), which is known to critically influence the development and progression of heart failure, in acute myocardial injury. It focuses on potential applications of the GRK2 peptide inhibitor βARKct in ischemic myocardial injury and on the use of GRK2 as a biomarker in acute myocardial infarction, and discusses the challenges of translating GRK2 inhibition as a cardioprotective strategy into a possible future clinical application.
Abstract:
In this review, we summarize the current "state of the art" of carbapenem antibiotics and their role in our antimicrobial armamentarium. Among the β-lactams currently available, carbapenems are unique because they are relatively resistant to hydrolysis by most β-lactamases, in some cases act as "slow substrates" or inhibitors of β-lactamases, and still target penicillin binding proteins. This "value-added feature" of inhibiting β-lactamases serves as a major rationale for expansion of this class of β-lactams. We describe the initial discovery and development of the carbapenem family of β-lactams. Of the early carbapenems evaluated, thienamycin demonstrated the greatest antimicrobial activity and became the parent compound for all subsequent carbapenems. To date, more than 80 compounds with mostly improved antimicrobial properties, compared to those of thienamycin, are described in the literature. We also highlight important features of the carbapenems that are presently in clinical use: imipenem-cilastatin, meropenem, ertapenem, doripenem, panipenem-betamipron, and biapenem. In closing, we emphasize some major challenges and urge the medicinal chemist to continue development of these versatile and potent compounds, as they have served us well for more than 3 decades.
Abstract:
Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, but especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters gamma in probability models f(y; gamma) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters gamma = (theta, eta) into a subset of interest theta and other "nuisance parameters" eta necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with the computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g., Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
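The notation in this abstract can be made concrete with a short worked example. The following LaTeX fragment restates the parameter partition and then shows, as an illustrative special case rather than a formula taken from the paper, how the proportional hazards model eliminates its nuisance component through the partial likelihood.

% Cox's-model notation: a probability model f(y; gamma) whose parameters
% are partitioned into interest and nuisance components.
\[
  \gamma = (\theta, \eta), \qquad y \sim f(y; \theta, \eta)
\]
% Illustrative special case: in the proportional hazards model, the hazard
% for covariates x factorizes as
\[
  \lambda(t \mid x) = \lambda_0(t)\, e^{x^{\top}\theta},
\]
% and the partial likelihood removes the nuisance baseline hazard
% \lambda_0(\cdot) (here playing the role of eta), so inference depends
% on theta alone:
\[
  L_p(\theta) = \prod_{i:\,\delta_i = 1}
    \frac{e^{x_i^{\top}\theta}}{\sum_{j \in R(t_i)} e^{x_j^{\top}\theta}},
\]
% where delta_i = 1 marks an observed event and R(t_i) is the risk set
% at time t_i.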
Abstract:
The increasing deployment of mobile communication base stations has led to an increasing demand for epidemiological studies on possible health effects of radio frequency emissions. The methodological challenges of such studies have been critically evaluated by a panel of scientists in the fields of radiofrequency engineering/dosimetry and epidemiology. Strengths and weaknesses of previous studies have been identified. Dosimetric concepts and crucial aspects of exposure assessment were evaluated with respect to epidemiological studies on different types of outcomes. We conclude that base station epidemiological studies are feasible in principle. However, the exposure contributions from all relevant radio frequency sources have to be taken into account. The applied exposure assessment method should be piloted and validated. Short- to medium-term effects on physiology or health-related quality of life are best investigated by cohort studies. For long-term effects, groups with a potential for high exposure first need to be identified; for immediate effects, human laboratory studies are the preferred approach.
Abstract:
The Audiovisual Media Services Directive (AVMSD), which regulates broadcasting and on-demand audiovisual media services, is at the nexus of current discussions about the convergence of media. The Green Paper of the Commission of April 2013 reflects the struggle of the European Union to come to terms with the phenomenon of convergence and highlights current legal uncertainties. The (theoretical) quest for an appropriate and future-oriented regulatory framework at the European level may be contrasted with the practice of national regulatory authorities. When faced with new media services and new business models, national regulators will inevitably have to make decisions and choices that take into account providers’ interests in offering their services as well as viewers’ interests in receiving information. This balancing act performed by national regulators may tip towards the former or the latter depending on the national legal framework; social, political and economic considerations; as well as cultural perceptions. This paper therefore examines how certain rules contained in the AVMSD are applied by national regulators. It focuses first on the definition of an on-demand audiovisual media service and its scope. Second, it analyses the measures adopted with a view to protecting minors in on-demand services, and third, it discusses national approaches towards the promotion of European works in on-demand services. It aims at underlining the significance of national regulatory authorities and the guidelines they adopt to clarify the rules of a key EU Directive of the “media law acquis”.
Abstract:
Watershed services are the benefits people obtain from the flow of water through a watershed. While demand for such services is increasing in most parts of the world, supply is becoming more insecure due to human impacts on ecosystems, such as climate or land-use change. Populations and water management authorities therefore require information on the potential availability of watershed services in the future and the trade-offs involved. In this study, the Soil and Water Assessment Tool (SWAT) is used to model watershed service availability for future management and climate change scenarios in the East African Pangani Basin. In order to quantify actual “benefits”, SWAT2005 was slightly modified, calibrated, and configured at the required spatial and temporal resolution so that simulated water resources and processes could be characterized based on their valuation by stakeholders and their accessibility. The calibrated model was then used to evaluate three management and three climate scenarios. The results show that by the year 2025 the greatest challenges will not primarily be the physical availability of water, but rather access to water resources and efficiency of use. Water to cover basic human needs is available at least 95% of the time but must be made accessible to the population through investments in distribution infrastructure. Concerning the trade-off between agricultural use and hydropower production, there is virtually no potential for an increase in hydropower, even if it is given priority. Agriculture will necessarily expand spatially as a result of population growth, and can even benefit from higher irrigation water availability per unit area, given improved irrigation efficiency and enforced regulation to ensure equitable distribution of available water. The decline in services from natural terrestrial ecosystems (e.g. charcoal, food) due to the expansion of agriculture increases the vulnerability of residents who depend on such services, mostly in times of drought. The expected impacts of climate change may contribute to an increase or decrease in watershed service availability, but they are only marginal and much lower than management impacts up to the year 2025.
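The core comparison described above, monthly available resource against monthly demand under each scenario, can be illustrated with a short script. This is a minimal sketch under invented numbers: the scenario names, the supply and demand series, and the reliability summary are illustrative assumptions, not outputs of the calibrated SWAT2005 model.

# Minimal sketch: compare monthly water availability with per-scenario
# demand and report supply reliability (hypothetical data, not SWAT output).
supply = [120, 110, 95, 80, 60, 45, 40, 42, 55, 75, 90, 105]  # hm^3 per month

scenarios = {  # hypothetical monthly demand per management scenario, hm^3
    "status_quo":     [50] * 12,
    "irrigation_up":  [50, 50, 55, 60, 70, 80, 85, 85, 70, 60, 55, 50],
    "hydro_priority": [65] * 12,
}

for name, demand in scenarios.items():
    # Monthly balance: positive means demand is covered that month.
    balance = [s - d for s, d in zip(supply, demand)]
    reliability = sum(b >= 0 for b in balance) / len(balance)
    print(f"{name:14s} reliability={reliability:5.0%} "
          f"worst_deficit={min(balance):+.0f} hm^3")

Run as-is, the script prints one reliability figure per scenario; in a real application the supply series would come from the calibrated hydrological model and the demand series from the stakeholder-derived water-use estimates.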
Abstract:
In natural hazard research, risk is defined as a function of (1) the probability of occurrence of a hazardous process, and (2) the assessment of the related extent of damage, defined by the value of the exposed elements at risk and their physical vulnerability. To date, various studies have been undertaken to determine vulnerability values for objects exposed to geomorphic hazards such as mountain torrents. Yet many studies only provide rough estimates of vulnerability values based on proxies for process intensities. Moreover, the vulnerability functions proposed in the literature span a wide range, in particular with respect to medium and high process magnitudes. In our study, we compare vulnerability functions for torrent processes derived from studies in test sites located in the Austrian Alps and in Taiwan. Based on this comparison, we identify needs for future research to enhance mountain hazard risk management, with a particular focus on the question of vulnerability at the catchment scale.
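The risk definition in the first sentence is commonly operationalized as a product of its components. The following LaTeX sketch is one standard formulation consistent with the abstract's wording; the symbols are our own notation, not the authors'.

% Risk for an element at risk i under a hazard scenario j:
% occurrence probability times exposed value times physical vulnerability.
\[
  R_{i,j} = p_j \cdot A_i \cdot v_{i,j},
  \qquad v_{i,j} \in [0, 1],
\]
% where p_j is the occurrence probability of scenario j, A_i the value of
% the exposed element at risk i, and v_{i,j} its physical vulnerability,
% i.e. the expected degree of loss given scenario j.

Empirical vulnerability functions of the kind compared in the study estimate v as a function of process intensity (e.g. flow depth or deposition height), which is where the wide range at medium and high magnitudes becomes apparent.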
Abstract:
CYP4F (cytochrome P450 4F) enzymes metabolize endogenous molecules including leukotrienes, prostaglandins, and arachidonic acid. The involvement of these endogenous compounds in inflammation has led to the hypothesis that changes in the inflamed tissue environment may affect the expression of CYP4Fs during the pro-inflammatory state, which in turn may modulate inflammatory conditions during the anti-inflammatory state. We demonstrated in a number of human samples that inflamed tissues have CYP4F isoform expression profiles that differ from those of the average population, and that CYP4F isoform expression levels change with the degree of inflammation present in the tissue. Further investigation in cell culture studies revealed that inflammatory cytokines, in particular TNF-α, play a role in regulating the expression of the CYP4F family. One isoform, CYP4F11, had characteristics different from those of the other five CYP4F family members: CYP4F11 metabolizes xenobiotics, while the other isoforms metabolize endogenous compounds with higher affinity. CYP4F11 was also expressed in high quantities in the brain and was up-regulated by TNF-α, whereas the other isoforms were not expressed in high quantities in the brain and were down-regulated by TNF-α. We identified the AP-1 protein of the JNK pathway as the signaling protein that causes a significant increase in CYP4F11 expression. Since TNF-α stimulation causes simultaneous activation of both the JNK pathway and NF-κB signaling, we further investigated the role that NF-κB plays in the expression of the CYP4F11 gene. We concluded that although there is a significant increase in CYP4F11 expression in the presence of TNF-α, the activation of NF-κB signaling inhibits CYP4F11 expression in a time-dependent manner: CYP4F11 expression is significantly increased only after 24 hours of treatment with TNF-α, while at shorter time points NF-κB signaling overpowers JNK pathway activation. We believe that these findings may in the future lead to improved drug design for modulating inflammation.
Abstract:
This editorial provides an overview of how the articles in this issue contextualize the wide range of perceptions and practices that support (and sometimes undermine) family strengths. Data on the challenges facing today’s families are presented.
Abstract:
Despite the universal prescription of sedative drugs in the intensive care unit (ICU), current practice is not guided by high-level evidence. Landmark sedation trials have made significant contributions to our understanding of the problems associated with ICU sedation and have promoted changes to current practice. We identified challenges and limitations of clinical trials which reduced the generalizability and the universal adoption of key interventions. We present an international perspective regarding current sedation practice and a blueprint for future research, which seeks to avoid known limitations and generate much-needed high-level evidence to better guide clinicians' management and therapeutic choices of sedative agents.
Abstract:
This paper applies a policy analysis approach to the question of how to effectively regulate micropollution in a sustainable manner. Micropollution is a complex policy problem characterized by a huge number and diversity of chemical substances, as well as various entry paths into the aquatic environment. It challenges traditional water quality management by calling for new technologies in wastewater treatment and for behavioral changes in industry, agriculture, and civil society. In light of such challenges, how can such a complex phenomenon be regulated to ensure that water quality is maintained in the future? And what can we learn from past experiences in water quality regulation? To answer these questions, policy analysis focuses strongly on the design and choice of policy instruments and the mix of such measures. In this paper, we review instruments commonly used in past water quality regulation and evaluate their ability to respond, in a sustainable way, to the characteristics of a more recent water quality problem, micropollution. On this basis, we develop a new framework that integrates both the problem dimension (i.e., the causes and effects of a problem) and the sustainability dimension (e.g., long-term, cross-sectoral, and multi-level) to assess which policy instruments are best suited to regulate micropollution. We conclude that sustainability criteria help to identify an appropriate instrument mix of end-of-pipe and source-directed measures to reduce aquatic micropollution.
Abstract:
Modeling future water systems at the regional scale is a difficult task due to the temporal and spatial complexity of current structures (multiple competing water uses, multiple actors, formal and informal rules). Representing this complexity in the modeling process is a challenge that can be addressed by an interdisciplinary and holistic approach. The assessment of the water system of the Crans-Montana-Sierre area (Switzerland) and its evolution until 2050 was tackled by combining glaciological, hydrogeological, and hydrological measurements and modeling with the evaluation of water use through documentary, statistical, and interview-based analyses. Four visions of future regional development were co-produced with a group of stakeholders and were then used as a basis for estimating future water demand. The comparison of the available water resource with the water demand at a monthly time scale allowed us to conclude that, for the four scenarios, socioeconomic factors will affect future water systems more than climatic factors. An analysis of the sustainability of the current and future water systems based on the four visions of regional development allowed us to identify the scenarios that will be more sustainable and that should be adopted by decision-makers. The results were then presented to the stakeholders through five key messages. The challenges of communicating results to stakeholders in this way are discussed at the end of the article.