824 results for Future of ILL.


Relevance: 100.00%

Abstract:

This paper identifies two narratives of the Anthropocene and explores how they play out in the realm of future-looking fashion production. Each narrative draws on mythic comparisons to gods and monsters to express humanity’s dilemmas, albeit from different perspectives. The first is a Malthusian narrative of collapse and scarcity, brought about by the monstrous, unstoppable nature of human technology set loose on the natural world. In this vein, the philosopher Slavoj Zizek (2010) draws on Biblical analogies, likening ecological crisis to one of the four horsemen of the apocalypse. To find a myth to suit the present times, the novelist A.S. Byatt (2011) proposes Ragnarök, a Norse myth in which the gods destroy themselves. In contrast, the second narrative is one of technological cornucopia. Stewart Brand (2009, 27), a self-described ‘eco-pragmatist’, writes, ‘we are as gods and we have to get good at it’. In his view, human technologies offer the only hope of mitigating the problems caused by human technology – Brand suggests harnessing nuclear power, the bioengineering of crops and the geoengineering of the planet as the way forward. Similarly, the French philosopher Bruno Latour (2012, 274) exhorts us to “love our monsters”, likening our technologies to Doctor Frankenstein’s monster – set loose upon the world, and then reviled by his creator. For both Brand and Latour, human technology may be monstrous, but it must also be turned toward solutions. Within this schema, hopeful visions of the future of fashion are similarly divided. In the techno-enabled cornucopian future, the fashion industry embraces wearable technology, speed and efficiency. Technologies such as waterless dyeing, 3D printing and self-cleaning garments shift fashion into a new era of cleaner production. Meanwhile, in the narrative of scarcity, a more cautious approach sees fashion return to a new localism and a valuing of the hand-made in a time of shrinking resources. Through discussion of future-looking fashion designers, brands and activists, this paper explores how they may align, along a spectrum, with one of these two grand narratives of the future. The paper discusses how these narratives may unconsciously shape the perspectives of both producers and users on the fashion of today and the fashion of tomorrow. It poses the question: what stories can be written for fashion’s future in the Anthropocene, and are they fated, or can they be re-written?

Relevance: 100.00%

Abstract:

The last time a peer-reviewed volume on the future of mental health facilities was produced was in 1959, following a symposium organised by the American Psychological Association. The consensus was easy enough to follow and still resonates today: the best spaces in which to treat psychiatric illness will be smaller, less restrictive units that offer more privacy and allow greater personalisation of space – possibly a converted hotel (Goshen, 1959). In some ways, all of those ideals have come to pass. An ideal typology was never established, but even so, facilities have shrunk from thousands of beds to units that typically house no more than 50 patients. Patients are generally more independent and are free to wander (within a unit) as they please. But the trend toward smaller and freer is reversing. This change is driven not by a desire to find the ideal building or better models of care, but by growing concerns about budgets, self-harm and psychiatric violence. This issue of Facilities comes at a time when healthcare design is increasingly dominated by codes, statutes and guidelines. But the articles herein are a call to stop and think. We are not at the point where guidelines can be helpful, because they do not embody any depth of knowledge or wisdom. These articles are intended to inject some new research on psychiatric/environmental interactions and also to remind planners and managers that guidelines might not tackle a core misunderstanding: managing fears about patient safety and the safety of society is not the purpose of the psychiatric facility. Its purpose is to create spaces that are suitable for improving the well-being of the mentally ill.

Relevance: 100.00%

Abstract:

Since the dawn of civilization, natural resources have remained the mainstay of humanity's remedial approaches to a large number of illnesses. Saraca asoca (Roxb.) de Wilde (Saraca indica L.), belonging to the family Caesalpiniaceae, has been regarded as a universal panacea in old Indian Ayurvedic texts and has especially been used to manage gynaecological complications and infections, besides treating haemorrhagic dysentery, uterine pain, bacterial infections, skin problems, tumours, worm infestations, and cardiac and circulatory problems. Almost all parts of the plant are considered pharmacologically valuable. Extensive folkloric practices and ethnobotanical applications of this plant have even led to the availability of several commercial S. asoca formulations recommended for different indications, though adulteration of these remains a pressing concern. Though a wealth of knowledge on this plant is available in both the classical and modern literature, extensive research on its phytomedicinal worth using state-of-the-art tools and methodologies is lacking. Recent reports on the bioprospecting of S. asoca endophytic fungi for industrial bioproducts and useful pharmacologically relevant metabolites provide a silver lining for uncovering single molecular bio-effectors from its endophytes. Here, we describe its socio-ethnobotanical usage, present its current pharmacological status and discuss potential bottlenecks in harnessing the proclaimed phytomedicinal worth of this prescribed Ayurvedic medicinal plant. Finally, we also look into the possible future of drug discovery and pharmaceutical R&D efforts directed at exploring its pharma legacy.

Relevance: 100.00%

Abstract:

Background: There has been relatively little research into health inequalities in older populations. This may be partly explained by the difficulty in identifying appropriate indicators of socio-economic status for older people. Ideally, indicators of socio-economic status to be used in studies of health inequalities in older populations should incorporate some measure of life-time socio-economic standing, and house value may fill this role. This study examined whether an indicator of accumulated wealth based on a combination of housing tenure and house value was a strong predictor of ill-health in older populations.
Methods: A total of 191 848 people aged ≥65 years and not living in communal establishments were identified from the 2001 Northern Ireland Census and followed for 5 years. Self-reported health and mortality risk by housing tenure/house value groupings were examined while controlling for a range of other demographic and socio-economic characteristics.
Results: Housing tenure/house value was highly correlated with other indicators of socio-economic status. Public-sector renters had worse self-reported health and higher mortality rates than owner occupiers, but significant gradients were also found between those living in the highest- and lowest-valued owner-occupier properties. The relationship between housing tenure and value was unchanged by adjustment for indicators of social support and quality of the physical environment. Adjustment for limiting long-term illness and self-reported health at baseline narrowed but did not eliminate the health gains associated with living in more expensive housing.
Conclusions: House value of residence is an accessible and powerful indicator of accumulated wealth that is highly correlated with current health status and predictive of future mortality risk in older populations.
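The kind of adjusted analysis described in the Methods can be illustrated with a brief, hypothetical sketch. The snippet below is not the study's code: the variable names, the synthetic data and the choice of a logistic regression for poor self-reported health (with high-value owner-occupiers as the reference group) are assumptions made purely for illustration.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
groups = ["renter_public", "owner_low_value", "owner_mid_value", "owner_high_value"]
df = pd.DataFrame({
    "tenure_value": rng.choice(groups, n),   # combined tenure / house value group (hypothetical)
    "age": rng.integers(65, 95, n),
    "female": rng.integers(0, 2, n),
})

# Synthetic outcome: poor health made more likely for public-sector renters
# and low-value owner-occupiers, and with increasing age.
base = {"renter_public": 0.8, "owner_low_value": 0.4,
        "owner_mid_value": 0.0, "owner_high_value": -0.4}
logit_p = df["tenure_value"].map(base) + 0.05 * (df["age"] - 75)
df["poor_health"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Logistic regression of poor self-reported health on the tenure/value group,
# adjusted for age and sex, with high-value owner-occupiers as the reference.
model = smf.logit(
    "poor_health ~ C(tenure_value, Treatment('owner_high_value')) + age + female",
    data=df,
).fit(disp=False)
print(np.exp(model.params))   # odds ratios relative to high-value owner-occupiers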

Relevance: 100.00%

Abstract:

Statistical techniques are fundamental in science, and linear regression analysis is perhaps one of the most widely used methodologies. It is well known from the literature that, under certain conditions, linear regression is an extremely powerful statistical tool. Unfortunately, in practice, some of these conditions are rarely satisfied and the regression models become ill-posed, making the application of traditional estimation methods unfeasible. This work presents some contributions to maximum entropy theory in the estimation of ill-posed models, in particular in the estimation of linear regression models with small samples affected by collinearity and outliers. The research is developed along three lines, namely the estimation of technical efficiency with state-contingent production frontiers, the estimation of the ridge parameter in ridge regression and, finally, new developments in maximum entropy estimation. In the estimation of technical efficiency with state-contingent production frontiers, the work carried out shows a better performance of the maximum entropy estimators compared with the maximum likelihood estimator. This good performance is notable in models with few observations per state and in models with a large number of states, which are commonly affected by collinearity. It is hoped that the use of maximum entropy estimators will contribute to the much-desired increase in empirical work with these production frontiers. In ridge regression, the greatest challenge is the estimation of the ridge parameter. Although numerous procedures are available in the literature, none outperforms all the others. In this work a new estimator of the ridge parameter is proposed, which combines ridge trace analysis with maximum entropy estimation. The results obtained in simulation studies suggest that this new estimator is one of the best procedures in the literature for estimating the ridge parameter. The Leuven maximum entropy estimator is based on the least squares method, on Shannon entropy and on concepts from quantum electrodynamics. This estimator overcomes the main criticism levelled at the generalized maximum entropy estimator, since it does not require supports for the parameters and errors of the regression model. In this work, new contributions to maximum entropy theory in the estimation of ill-posed models are presented, building on the Leuven maximum entropy estimator, information theory and robust regression. The estimators developed show a good performance in linear regression models with small samples affected by collinearity and outliers. Finally, some computational codes for maximum entropy estimation are presented, thereby contributing to an increase in the scarce computational resources currently available.
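As a rough illustration of the maximum entropy estimation discussed above, the sketch below implements a generalized maximum entropy (GME) estimator for a small, collinear linear regression problem: each coefficient and each error is written as an expected value over an assumed support, and the probability weights are chosen to maximize Shannon entropy subject to model consistency. It is only a minimal sketch under assumed choices (support bounds, sample size, a general-purpose SLSQP optimizer); it is not the thesis's own code and does not implement the Leuven estimator or the new ridge-parameter estimator.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, k = 20, 3                                     # small sample, as in the ill-posed settings above
X = rng.normal(size=(n, k))
X[:, 2] = X[:, 1] + 0.01 * rng.normal(size=n)    # induce near-collinearity
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

M = 5                                            # support points per parameter and per error
z = np.linspace(-10.0, 10.0, M)                  # assumed parameter support
v = np.linspace(-3 * y.std(), 3 * y.std(), M)    # error support (three-sigma rule)

def unpack(x):
    p = x[:k * M].reshape(k, M)                  # probability weights for the parameters
    w = x[k * M:].reshape(n, M)                  # probability weights for the errors
    return p, w

def neg_entropy(x):
    return np.sum(x * np.log(x + 1e-12))         # negative Shannon entropy (to minimize)

def model_consistency(x):
    p, w = unpack(x)
    return y - X @ (p @ z) - w @ v               # enforce y = X*beta + e

def adding_up(x):
    p, w = unpack(x)
    return np.concatenate([p.sum(axis=1) - 1.0, w.sum(axis=1) - 1.0])

x0 = np.full((k + n) * M, 1.0 / M)               # start from uniform weights
res = minimize(neg_entropy, x0, method="SLSQP",
               bounds=[(1e-10, 1.0)] * x0.size,
               constraints=[{"type": "eq", "fun": model_consistency},
                            {"type": "eq", "fun": adding_up}])
p_hat, _ = unpack(res.x)
print("GME estimates:", p_hat @ z)               # expected values over the parameter support

The adding-up constraints keep each probability vector on the simplex, and the coefficients are recovered as expected values over the assumed supports, which is what gives GME its shrinkage-like behaviour under collinearity.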

Relevance: 100.00%

Abstract:

Recent research has suggested that relatively cold UK winters are more common when solar activity is low (Lockwood et al 2010 Environ. Res. Lett. 5 024001). Solar activity during the current sunspot minimum has fallen to levels unknown since the start of the 20th century (Lockwood 2010 Proc. R. Soc. A 466 303–29), and records of past solar variations inferred from cosmogenic isotopes (Abreu et al 2008 Geophys. Res. Lett. 35 L20109) and geomagnetic activity data (Lockwood et al 2009 Astrophys. J. 700 937–44) suggest that the current grand solar maximum is coming to an end and hence that solar activity can be expected to continue to decline. Combining cosmogenic isotope data with the long record of temperatures measured in central England, we estimate how solar change could influence the future probability of further UK winters that are cold relative to the hemispheric mean temperature, if all other factors remain constant. Global warming is taken into account only through the detrending using mean hemispheric temperatures. We show that some predictive skill may be obtained by including the solar effect.
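A minimal, purely illustrative sketch of this kind of analysis is given below, using synthetic placeholder data; the paper itself uses the Central England Temperature record, hemispheric mean temperatures and cosmogenic-isotope-based solar indices, and its statistical treatment may differ. The idea shown is simply to flag winters that are cold relative to the hemispheric mean and relate their occurrence to a solar activity index.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_winters = 150
solar_index = rng.uniform(0, 1, n_winters)            # low values = quiet Sun (placeholder)
hemispheric_mean = np.linspace(-0.3, 0.6, n_winters)  # warming trend (placeholder)

# Synthetic winter temperatures: follow the hemispheric mean, plus a weak
# solar modulation and noise.
cet_winter = hemispheric_mean + 0.5 * solar_index + rng.normal(0, 0.8, n_winters)

# Detrend: express each winter relative to the hemispheric mean, then flag
# "relatively cold" winters as the lowest quartile of the residuals.
residual = cet_winter - hemispheric_mean
cold = (residual < np.quantile(residual, 0.25)).astype(int)

# Logistic regression of cold-winter occurrence on the solar index.
result = sm.Logit(cold, sm.add_constant(solar_index)).fit(disp=False)
print(result.params)   # a negative slope indicates cold winters are more
                       # likely when solar activity is low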

Relevance: 100.00%

Abstract:

Observational evidence indicates significant regional trends in solar radiation at the surface in both all-sky and cloud-free conditions. Negative trends in the downwelling solar surface irradiance (SSI) have become known as ‘dimming’, while positive trends have become known as ‘brightening’. We use the Met Office Hadley Centre HadGEM2 climate model to model trends in cloud-free and total SSI from the pre-industrial period to the present day and compare these against observations. Simulations driven by CMIP5 emissions are used to model the future trends in dimming/brightening up to the year 2100. The modeled trends are reasonably consistent with observed regional trends in dimming and brightening, which are due to changes in concentrations of anthropogenic aerosols and, potentially, changes in cloud cover owing to the aerosol indirect effects and/or cloud feedback mechanisms. The future dimming/brightening in cloud-free SSI is not caused only by changes in anthropogenic aerosols: aerosol impacts are overwhelmed by a large dimming caused by increases in water vapor. There is little trend in the total SSI, as cloud cover decreases in the climate model used here and compensates for the effect of the change in water vapor. In terms of the surface energy balance, these trends in SSI are obviously more than compensated for by the increase in the downwelling terrestrial irradiance from increased water vapor concentrations. However, the study shows that, while water vapor is widely appreciated as a greenhouse gas, its impact on the atmospheric transmission of solar radiation and on the future of global dimming/brightening should not be overlooked.

Relevance: 100.00%

Abstract:

This conference was an unusual and interesting event. Celebrating 25 years of Construction Management and Economics provides us with an opportunity to reflect on the research that has been reported over the years, to consider where we are now, and to think about the future of academic research in this area. Hence the sub-title of this conference: “past, present and future”. Looking through these papers, some things are clear. First, the range of topics considered interesting has expanded hugely since the journal was first published. Second, the research methods are also more diverse. Third, the involvement of wider groups of stakeholders is evident. There is a danger that this might lead to dilution of the field.
But my instinct has always been to argue against the notion that Construction Management and Economics represents a discipline as such. Granted, there are plenty of university departments around the world that would justify the idea of a discipline. But the vast majority of academic departments who contribute to the life of this journal carry names different from this. Indeed, the range and breadth of methodological approaches to the research reported in Construction Management and Economics indicates that there are several different academic disciplines being brought to bear on the construction sector. Some papers are based on economics, some on psychology and others on operational research, sociology, law, statistics, information technology, and so on. This is why I maintain that construction management is not an academic discipline, but a field of study to which a range of academic disciplines are applied. This may be why it is so interesting to be involved in this journal.
The problems to which the papers are applied develop and grow. But the broad topics of the earliest papers in the journal are still relevant today. What has changed a lot is our interpretation of the problems that confront the construction sector all over the world, and the methodological approaches to resolving them. There is a constant difficulty in dealing with topics as inherently practical as these. While the demands of the academic world are driven by the need for the rigorous application of sound methods, the demands of the practical world are quite different. It can be difficult to meet the needs of both sets of stakeholders at the same time. However, increasing numbers of postgraduate courses in our area result in larger numbers of practitioners with a deeper appreciation of what research is all about, and of how to interpret and apply the lessons from research.
It also seems that there are contributions coming not just from construction-related university departments, but also from departments with identifiable methodological traditions of their own. I like to think that our authors can publish in journals beyond the construction-related areas, to disseminate their theoretical insights into other disciplines, and to contribute to the strength of this journal by citing our articles in more mono-disciplinary journals. This would contribute to the future of the journal in a very strong and developmental way. The greatest danger we face is excessive self-citation, i.e. referring only to sources within the CM&E literature or, worse, referring only to other articles in the same journal. The only way to ensure a strong and influential position for journals and university departments like ours is to be sure that our work is informing other academic disciplines.
This is what I would see as the future, our logical next step. If, as a community of researchers, we are not producing papers that challenge and inform the fundamentals of research methods and analytical processes, then no matter how practically relevant our output is to the industry, it will remain derivative and secondary, based on the methodological insights of others. The balancing act between methodological rigour and practical relevance is a difficult one, but not, of course, a balance that has to be struck in every single paper.

Relevance: 100.00%

Abstract:

This essay reviews the ways in which literary manuscripts may be considered to be archivally unique, as well as valuable in all senses of the word, and gives a cautious appraisal of their future in the next ten to twenty years. It reviews the essential nature of literary manuscripts, and especially the ways in which they form “split collections”. This leads to an assessment of the work of the Diasporic Literary Archives network from 2012 to 2014, and some of the key findings. The essay closes with reflections on the future of literary manuscripts in the digital age – emerging trends, research findings, uncertainties and unknowns.

Relevance: 100.00%

Abstract:

A probable capture of Phobos into an interesting resonance was presented in our previous work. With a simple model, considering Mars in a Keplerian and circular orbit, it was shown that, once captured in the resonance, the inclination of the satellite reaches very high values. Here, the integrations are extended to much longer times and escape situations are analyzed. These escapes are due to the interaction of new additional resonances, which appear as the increasing inclination reaches some specific values. Compared with classical capture in mean motion resonances, we see some interesting differences in this problem. We also include the effect of Mars' eccentricity in the capture process. The role played by this eccentricity becomes important, particularly when Phobos encounters a double resonance at a ≈ 2.619 R_M. Planetary perturbations acting on Mars and the variation of its equator are also included. In general, some possible scenarios for the future of Phobos are presented.

Relevance: 100.00%

Abstract:

Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, but especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters γ in probability models f(y; γ) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters γ = (θ, η) into a subset of interest θ and other "nuisance parameters" η necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with the computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g. Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
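As a small illustration of the proportional hazards model mentioned above, the sketch below fits a Cox model to synthetic data using the lifelines library. The covariate effects play the role of the parameters of interest (θ), while the unspecified baseline hazard is the nuisance component (η) that the partial likelihood avoids estimating directly. The data, column names and library choice are illustrative assumptions, not material from the paper.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
age = rng.normal(60, 10, n)
treatment = rng.integers(0, 2, n)

# Exponential survival times whose rate depends on the covariates;
# the baseline rate plays the role of the nuisance component.
hazard = 0.01 * np.exp(0.03 * (age - 60) - 0.5 * treatment)
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.exponential(80.0, n)

df = pd.DataFrame({
    "duration": np.minimum(event_time, censor_time),
    "event": (event_time <= censor_time).astype(int),
    "age": age,
    "treatment": treatment,
})

# Partial-likelihood fit: estimates the covariate effects (the parameters
# of interest) without specifying the baseline hazard.
cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()   # hazard ratios for age and treatment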

Relevance: 100.00%

Abstract:

Sweden’s annual security and defence conference, which this year focused on the future of the country’s security policy, was held in Sälen on 12-14 January. It was attended by almost all the leaders of Sweden’s ruling and opposition parties. The discussions have revealed whether and how the mindset of the Swedish elite has changed following the heated debates on defence issues in 2013. The opposition parties (Social Democrats, the Green Party, and the Left Party), which are likely to form a coalition government after the election to the Swedish parliament in September 2014, were given the opportunity to present their own priorities. The discussions have brought to the surface conflicting perceptions within the political elite concerning the threats and challenges to Swedish security, and divergent positions on the future direction of the country’s security and defence policy. It is highly likely that, due to a coalition compromise, the current course of Sweden’s security policy (namely, a policy of non-alignment along with close co-operation with NATO) will be maintained following the parliamentary election, albeit with new “leftist” influences (a greater involvement in the United Nations). Big changes that could lead to a significant strengthening of Sweden’s defence capabilities, or a decision on NATO membership, are not likely. Paradoxically, polls suggest that in the long run a more radical change in Stockholm’s security policy may be shaped by a gradual, bottom-up evolution of public opinion on the issue.

Relevance: 100.00%

Abstract:

In the last five years, deep cracks have appeared in the European project. The 'euro-area crisis', triggered by a severe global financial and economic crisis, has put European integration to a major test, more profound than ever before. The experience of recent years has revealed and exacerbated significant deficiencies in the European Union's (EU) economic and political construction. At times it has cast doubt on the fundamentals of the European project and raised questions about whether Europe will be able to deal effectively not only with the immediate crisis, but also with the many other serious socio-economic, politico-institutional, societal and global challenges that Europe is and will be confronted with. At the start of a new institutional-political cycle (2014-2019), and while the crisis situation has for a number of reasons improved significantly since the summer of 2012, at least in systemic terms, the Union's new leadership and Member States will now have to take strategic decisions about the future of European integration.