748 results for Explosion
Abstract:
iPTF14atg, a subluminous peculiar Type Ia supernova (SN Ia) similar to SN 2002es, is the first SN Ia for which a strong UV flash was observed in the early-time light curves. This has been interpreted as evidence for a single-degenerate (SD) progenitor system, where such a signal is expected from interactions between the SN ejecta and the non-degenerate companion star. Here, we compare synthetic observables of multidimensional state-of-the-art explosion models for different progenitor scenarios to the light curves and spectra of iPTF14atg. From our models, we have difficulties explaining the spectral evolution of iPTF14atg within the SD progenitor channel. In contrast, we find that a violent merger of two carbon-oxygen white dwarfs with 0.9 and 0.76 M⊙, respectively, provides an excellent match to the spectral evolution of iPTF14atg from 10 d before to several weeks after maximum light. Our merger model does not naturally explain the initial UV flash of iPTF14atg. We discuss several possibilities, such as interactions of the SN ejecta with the circumstellar medium and surface radioactivity from a He-ignited merger, that may be able to account for the early UV emission in violent merger models.
Abstract:
We present self-consistent, axisymmetric core-collapse supernova simulations performed with the Prometheus-Vertex code for 18 pre-supernova models in the range of 11–28 M⊙, including progenitors recently investigated by other groups. All models develop explosions, but depending on the progenitor structure, they can be divided into two classes. For progenitors with a steep density decline at the Si/Si–O interface, the arrival of this interface at the shock front leads to a sudden drop of the mass-accretion rate, triggering a rapid approach to explosion. For progenitors with a more gradually decreasing accretion rate, it takes longer for the neutrino heating to overcome the accretion ram pressure, and explosions set in later. Early explosions are facilitated by high mass-accretion rates after bounce and correspondingly high neutrino luminosities, combined with a pronounced drop of the accretion rate and ram pressure at the Si/Si–O interface. Because of rapidly shrinking neutron star radii and shock fronts that recede after passing through their maxima, our models exhibit short advection timescales, which favor the efficient growth of the standing accretion-shock instability. The latter plays a supportive role at least in initiating the re-expansion of the stalled shock before runaway. Taking into account the effects of turbulent pressure in the gain layer, we derive a generalized condition for the critical neutrino luminosity that captures the explosion behavior of all models very well. We validate the robustness of our findings by testing the influence of stochasticity, numerical resolution, and approximations in some aspects of the microphysics.
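The generalized critical condition is essentially a scaling relation. The following is a minimal sketch, assuming a turbulent-pressure correction of the form derived by Müller & Janka (2015), on which this kind of analysis builds; the normalization and the example numbers are illustrative placeholders, not values from the paper.

```python
def critical_luminosity(mdot, m_ns, r_gain, mach2, norm=1.0):
    """Sketch of a generalized critical-luminosity condition.

    Assumed scaling: L_crit ~ (Mdot * M)**(3/5) * r_gain**(-2/5)
                              * (1 + (4/3) * <Ma^2>)**(-3/5),
    where <Ma^2> is the mean squared turbulent Mach number in the
    gain layer (Mueller & Janka 2015 give this form of the turbulent
    correction).  `norm` is an illustrative placeholder, not the
    calibrated prefactor of the paper.

    mdot   : mass-accretion rate [M_sun / s]
    m_ns   : proto-neutron-star mass [M_sun]
    r_gain : gain radius [km]
    mach2  : mean squared turbulent Mach number (dimensionless)
    """
    laminar = norm * (mdot * m_ns) ** 0.6 * r_gain ** (-0.4)
    # Turbulent pressure supplements thermal pressure in the gain
    # layer, so less neutrino heating is needed for runaway shock
    # expansion, reducing the critical luminosity.
    return laminar * (1.0 + (4.0 / 3.0) * mach2) ** (-0.6)

# Example (illustrative numbers): turbulence with <Ma^2> = 0.5 lowers
# the critical luminosity by ~26% relative to the laminar case.
base = critical_luminosity(0.3, 1.5, 100.0, 0.0)
turb = critical_luminosity(0.3, 1.5, 100.0, 0.5)
print(f"reduction: {1.0 - turb / base:.0%}")
```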
Abstract:
Otto-von-Guericke-Universität Magdeburg, Faculty of Process and Systems Engineering, Dissertation, 2016
Abstract:
Context. Recent observations of brown dwarf spectroscopic variability in the infrared infer the presence of patchy cloud cover. Aims. This paper proposes a mechanism for producing inhomogeneous cloud coverage via the depletion of cloud particles through the Coulomb explosion of dust in atmospheric plasma regions. Charged dust grains Coulomb-explode when the electrostatic stress on the grain exceeds its mechanical tensile stress, which results in grains below a critical radius a < a_crit^Coul being broken up. Methods. This work outlines the criteria required for the Coulomb explosion of dust clouds in substellar atmospheres, the effect on the dust particle size distribution function, and the resulting radiative properties of the atmospheric regions. Results. Our results show that for an atmospheric plasma region with an electron temperature of T_e = 10 eV (≈10⁵ K), the critical grain radius varies from 10⁻⁷ to 10⁻⁴ cm, depending on the grains' tensile strength. Higher critical radii, up to 10⁻³ cm, are attainable for higher electron temperatures. We find that the process produces a bimodal particle size distribution composed of stable nanoscale seed particles and dust particles with a ≥ a_crit^Coul, with the intervening particle sizes defining a region devoid of dust. As a result, the dust population is depleted, and the clouds become optically thin in the wavelength range 0.1–10 μm, with a characteristic peak that shifts to longer wavelengths as more sub-micrometer particles are destroyed. Conclusions. In an atmosphere populated with a distribution of plasma volumes, this will yield regions of contrasting radiative properties, thereby giving a source of inhomogeneous cloud coverage. The results presented here may also be relevant for dust in supernova remnants and protoplanetary disks.
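The critical radius follows from equating the electrostatic surface stress of a charged grain to its tensile strength. A minimal sketch, assuming an OML-style surface potential of a few k_B·T_e/e; the potential factor of 2.5 and the tensile-strength values below are illustrative assumptions, not taken from the paper:

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity [F/m]

def critical_radius(t_e_ev, tensile_strength_pa, phi_factor=2.5):
    """Critical grain radius below which Coulomb explosion occurs.

    A grain charged to surface potential phi carries an electrostatic
    stress ~ eps0 * (phi/a)**2 / 2 at its surface; equating this to
    the tensile strength Sigma gives a_crit = phi * sqrt(eps0 / (2*Sigma)).
    phi ~ phi_factor * k_B * T_e / e is an assumed OML-style estimate.
    """
    phi = phi_factor * t_e_ev  # surface potential [V] for T_e in eV
    return phi * math.sqrt(EPS0 / (2.0 * tensile_strength_pa))  # [m]

# Example: T_e = 10 eV; tensile strengths from weak aggregates (~1 kPa)
# to compact grains (~1 GPa) span a_crit ~ 2e-4 cm down to ~2e-7 cm,
# roughly the 10^-4 to 10^-7 cm range quoted in the abstract.
for sigma in (1e3, 1e6, 1e9):
    a_crit_m = critical_radius(10.0, sigma)
    print(f"Sigma = {sigma:.0e} Pa -> a_crit = {a_crit_m * 100:.2e} cm")
```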
Abstract:
In the past decade, Spain's generous incentive system for renewable energy production attracted substantial foreign and national investment. However, when the global financial crisis hit and electricity consumption consequently fell, the incentives began to generate a tariff deficit in the electricity system, leading the Spanish government first to cut back and then to eliminate the incentives. In the wake of losses, international investors turned to investment arbitration, while national investors could only present their claims before Spanish courts. The result was a potential for differential treatment between national and foreign investors. This paper examines the incentive regime and the government's changes to it in order to understand the investors' claims and the reasoning that resulted in their rejection, both in national courts and in the only arbitration award issued to date. The paper concludes with a discussion of the effect of the renewable energies situation on the investment arbitration debate within Spanish civil society.
Abstract:
Iconic and significant buildings are common targets of terrorist bombings, causing large numbers of casualties and extensive property damage. Recent incidents have been external bomb attacks on multi-storey buildings with reinforced concrete frames. Under blast loading, critical damage initiates at the lower storeys of a building and may then lead to progressive collapse of the whole or part of the structure. It is therefore important to identify the critical initial influence regions along the height, width, and depth of a building exposed to blast effects, and the structural response, in order to assess the vulnerability of the structure to disproportionate and progressive collapse. This paper discusses the blast response and the propagation of its effects on a two-dimensional reinforced concrete (RC) frame designed to withstand normal gravity loads. The explicit finite element code LS-DYNA is used for the analysis. A complete RC portal frame, seven storeys by six bays, is modelled with reinforcement details and appropriate materials to simulate strain-rate effects. Explosion loads derived from standard manuals are applied as idealized triangular pressures on the column faces of the numerical models. The analysis reports the influence of blast propagation in terms of displacements and material yielding of the structural elements in the RC frame. The affected regions are identified and classified according to the load cases. This information can be used to determine the vulnerability of multi-storey RC buildings to various external explosion scenarios and to design buildings to resist blast loads.
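For reference, the idealized triangular pulse used in such simplified load models decays linearly from the peak overpressure to zero over the positive-phase duration, with the negative phase neglected. A minimal sketch, with placeholder peak pressure and duration rather than the paper's load cases:

```python
def triangular_blast_pressure(t, p_peak, t_d):
    """Idealized triangular blast overpressure history.

    The positive phase is approximated by a linear decay from the
    peak overpressure p_peak at t = 0 to zero at t = t_d; the
    negative phase is neglected, as in simplified load models
    derived from standard blast-design manuals.

    t      : time after arrival [s]
    p_peak : peak overpressure on the column face [Pa]
    t_d    : positive-phase duration [s]
    """
    if 0.0 <= t <= t_d:
        return p_peak * (1.0 - t / t_d)
    return 0.0

# Example (illustrative values): a 500 kPa peak with a 15 ms positive
# phase, sampled at 1 ms intervals for use as a load curve in an FE model.
for i in range(5):
    t = i * 1e-3
    p = triangular_blast_pressure(t, 5e5, 15e-3)
    print(f"t = {t * 1e3:4.1f} ms, p = {p / 1e3:6.1f} kPa")
```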
Abstract:
Over the last decade, the rapid growth and adoption of the World Wide Web has further exacerbated user needs for efficient mechanisms for information and knowledge location, selection, and retrieval. How to gather useful and meaningful information from the Web has become challenging to users. The capture of user information needs is key to delivering users' desired information, and user profiles can help to capture information needs. However, effectively acquiring user profiles is difficult. It is argued that if user background knowledge can be specified by ontologies, more accurate user profiles can be acquired and thus information needs can be captured effectively. Web users implicitly possess concept models that are obtained from their experience and education, and use these concept models in information gathering. Prior to this work, much research attempted to use ontologies to specify user background knowledge and user concept models. However, these works have a drawback in that they cannot move beyond the subsumption of super- and sub-class structure to emphasising the specific semantic relations in a single computational model. This has also been a challenge for years in the knowledge engineering community. Thus, using ontologies to represent user concept models and to acquire user profiles remains an unsolved problem in personalised Web information gathering and knowledge engineering. In this thesis, an ontology learning and mining model is proposed to acquire user profiles for personalised Web information gathering. The proposed computational model emphasises the specific is-a and part-of semantic relations in one computational model. World knowledge and users' Local Instance Repositories are used to discover and specify user background knowledge. From a world knowledge base, personalised ontologies are constructed by adopting automatic or semi-automatic techniques to extract user interest concepts, focusing on user information needs. A multidimensional ontology mining method, Specificity and Exhaustivity, is also introduced in this thesis for analysing the user background knowledge discovered and specified in user personalised ontologies. The ontology learning and mining model is evaluated by comparison with human-based and state-of-the-art computational models in experiments, using a large, standard data set. The experimental results are promising. The proposed ontology learning and mining model helps to develop a better understanding of user profile acquisition, thus informing better design of personalised Web information gathering systems. The contributions are increasingly significant, given both the rapid explosion of Web information in recent years and today's accessibility to the Internet and the full-text world.
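To make the modelling point concrete, the sketch below shows a concept structure that keeps is-a and part-of relations distinct in one model instead of collapsing them into subsumption; the concepts and the traversal are illustrative only and do not reproduce the thesis's Specificity and Exhaustivity method.

```python
from collections import defaultdict

class ConceptGraph:
    """Tiny concept model with typed semantic relations.

    Keeps is-a and part-of edges distinct in a single structure,
    rather than flattening everything into super-/sub-class links.
    """

    def __init__(self):
        self.edges = defaultdict(list)  # concept -> [(parent, relation)]

    def add(self, child, parent, relation):
        assert relation in ("is-a", "part-of")
        self.edges[child].append((parent, relation))

    def ancestors(self, concept, relations=("is-a", "part-of")):
        """Concepts reachable via the selected relation types."""
        seen, stack = set(), [concept]
        while stack:
            for parent, rel in self.edges[stack.pop()]:
                if rel in relations and parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return seen

# Illustrative concepts: a pure subsumption query misses the
# part-of link that the typed model retains.
g = ConceptGraph()
g.add("laptop", "computer", "is-a")
g.add("computer", "device", "is-a")
g.add("hard disk", "laptop", "part-of")

print(g.ancestors("hard disk"))                       # {'laptop', 'computer', 'device'}
print(g.ancestors("hard disk", relations=("is-a",)))  # set(): subsumption alone finds nothing
```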
Abstract:
In this article, some basic laboratory bench experiments are described that are useful for teaching high school students some of the basic principles of stellar astrophysics. For example, in one experiment, students slam a plastic water-filled bottle down onto a bench, ejecting water towards the ceiling and illustrating the physics associated with a Type II supernova explosion. In another experiment, students roll marbles up and down a double ramp in an attempt to get a marble to enter a tube halfway up the slope, which illustrates quantum tunnelling in stellar cores. The experiments are reasonably low cost to either purchase or manufacture.
Abstract:
The Guardian reportage of the United Kingdom Member of Parliament (MP) expenses scandal of 2009 used crowdsourcing and computational journalism techniques. Computational journalism can be broadly defined as the application of computer science techniques to the activities of journalism. Its foundation lies in computer-assisted reporting techniques, and its importance is increasing due to: (a) the increasing availability of large-scale government datasets for scrutiny; (b) the declining cost, increasing power, and ease of use of data mining and filtering software, and Web 2.0; and (c) the explosion of online public engagement and opinion. This paper provides a case study of the Guardian MP expenses scandal reportage and reveals some key challenges and opportunities for digital journalism. It finds that journalists may increasingly take an active role in understanding, interpreting, verifying, and reporting clues or conclusions that arise from the interrogation of datasets (computational journalism). Secondly, a distinction should be made between information reportage and computational journalism in the digital realm, just as a distinction might be made between citizen reporting and citizen journalism. Thirdly, an opportunity exists for online news providers to take a ‘curatorial’ role, selecting and making easily available the best data sources for readers to use (information reportage). These activities have always been fundamental to journalism; however, the way in which they are undertaken may change. Findings from this paper may suggest opportunities and challenges for the implementation of computational journalism techniques in practice by digital Australian media providers, and further areas of research.
Abstract:
According to a recent OECD report, there are about 82 million immigrants in the OECD area; worldwide, there are about 191 million immigrants and displaced persons, and some 30–40 million unauthorised immigrants. Also according to the OECD report, little in-depth research has been carried out to date to help decision makers in government, business, and society at large to better understand the complexities and wider consequences of future migration flows. The literature has also indicated that the lack of a skilled population in much-needed occupations in countries of destination has contributed to the need to bring in skilled foreign workers. Furthermore, despite the current global financial crisis, some areas of occupation remain in such need of skilled workers that, in a job-scarce environment where jobs become fewer, employers are more likely to demand skilled workers from among both natives and immigrants. Global competition for labour is expected to intensify, especially for top talent, highly qualified and semi-skilled individuals. This exacerbates the problems faced by current skilled immigrants and skilled refugees, particularly those from non-main English-speaking countries who are not employed at their optimal skill level in countries of destination. This research study investigates whether skilled immigrants are being effectively utilised in their countries of destination, in the context of employment. In addition to skilled immigrants, the data sampling will also include skilled refugees who, although arriving under the humanitarian program, possess formal qualifications from their country of origin. Underlying variables will be explored, such as the strength of social capital or interpersonal ties, and human capital in terms of educational attainment and proficiency in the English language. The aim of the study is to explain the relationship between the variables and whether the variables influence employment outcomes. A broad-ranging preliminary literature review has been undertaken to explore the substantial bodies of knowledge on skilled immigrants worldwide, including skilled refugees, and to investigate whether the utilisation issues are universal or specific to a country. In addition, preliminary empirical research and analysis has been undertaken to set the research focus and to identify the problems beyond the literature. Preliminary findings indicate that immigrants and refugees from non-main English-speaking countries are particularly affected by employment issues regardless of the skills and qualifications acquired in their countries of origin, compared with immigrants from main English-speaking countries. Preliminary findings from the literature review also indicate that gaps in knowledge still exist. Although the past two decades have witnessed a virtual explosion of theory and research on international migration, no in-depth research has been located that specifically links immigrants' and refugees' social and human capital to employment outcomes. This research study aims to fill these gaps and subsequently contribute to the contemporary body of knowledge on the utilisation of skilled immigrants and skilled refugees, specifically those from non-main English-speaking countries. A mixed-methods design will be used, incorporating techniques from both quantitative and qualitative research traditions that will be triangulated at the end of the data collection stage.
Abstract:
Curriculum demands continue to increase on school education systems, with teachers at the forefront of implementing syllabus requirements. Education is frequently reported as a solution to most societal problems and, as a result of the world's information explosion, teachers are expected to cover more and more within teaching programs. How can teachers combine subjects in order to capitalise on the competing educational agendas within school timeframes? Fusing curricula requires the bonding of standards from two or more syllabuses. Both technology and ICT complement the learning of science. This study analyses selected examples of preservice teachers' overviews for fusing science, technology and ICT. These program overviews focused on primary students and the achievement of two standards (one from science and one from either technology or ICT). These primary preservice teachers' fused-curricula overviews included scientific concepts and related technology and/or ICT skills and knowledge. Findings indicated a range of innovative curriculum plans for teaching primary science through technology and ICT, demonstrating that these subjects can form cohesive links towards achieving the respective learning standards. Teachers can work more astutely by fusing curricula; however, further professional development may be required to advance thinking about these processes. Bonding subjects through their learning standards can extend beyond previous integration or thematic work where standards may not have been assessed. Education systems need to articulate through syllabus documents how effective fusing of curricula can be achieved. It appears that education is a key avenue for addressing societal needs, problems and issues. Education is promoted as a universal solution, which has resulted in curriculum overload (Dare, Durand, Moeller, & Washington, 1997; Vinson, 2001). Societal and curriculum demands have placed added pressure on teachers, with many additional education issues increasing teachers' workloads (Mobilise for Public Education, 2002). For example, as Australia has weather conducive to outdoor activities, social problems and issues arise that are reported through the media calling for action; consequently, schools have become involved in swimming programs, road and bicycle safety programs, and a wide range of activities that had been considered a parental responsibility in the past. Teachers are expected to plan, implement and assess these extra-curricular activities within their already overcrowded timetables. At the same time, key learning areas (KLAs) such as science and technology are mandatory requirements within all Australian education systems. These systems have syllabuses outlining levels of content and the anticipated learning outcomes (also known as standards, essential learnings, and frameworks). Time allocated for teaching science is obviously an issue. In 2001, it was estimated that on average the time spent teaching science in Australian primary schools was almost an hour per week (Goodrum, Hackling, & Rennie, 2001). More recently, a study undertaken in the U.S. reported a similar finding: more than 80% of the teachers in K-5 classrooms spent less than an hour teaching science (Dorph, Goldstein, Lee, et al., 2007). More importantly, 16% spent no time at all teaching science in their classrooms. Teachers need to learn to work smarter by optimising the use of their in-class time.
Integration is proposed as one of the ways to address the issue of curriculum overload (Venville & Dawson, 2005; Vogler, 2003). Even though there may be a lack of definition for integration (Hurley, 2001), curriculum integration aims at covering key concepts in two or more subject areas within the same lesson (Buxton & Whatley, 2002). This implies covering the curriculum in less time than if the subjects were taught separately; therefore teachers should have more time to cover other educational issues. In practice, the reality can be decidedly different (e.g., Brophy & Alleman, 1991; Venville & Dawson, 2005). Nevertheless, teachers report that students expand their knowledge and skills as a result of subject integration (James, Lamb, Householder, & Bailey, 2000). There seems to be considerable value in integrating science with other KLAs beyond addressing teaching workloads. Over two decades ago, Cohen and Staley (1982) claimed that integration can bring into the primary curriculum a subject that may otherwise be left out. Integrating science education aims to develop a more holistic perspective. Indeed, life is not neat components of stand-alone subjects; life integrates subject content in numerous ways, and curriculum integration can assist students to make these real-life connections (Burnett & Wichman, 1997). Science integration can provide the scope for real-life learning and the possibility of targeting students' learning styles more effectively by providing more than one perspective (Hudson & Hudson, 2001). To illustrate, technology is essential to science education (Blueford & Rosenbloom, 2003; Board of Studies, 1999; Penick, 2002), and constructing technology immediately evokes a social purpose for such construction (Marker, 1992). For example, building a model windmill requires science and technology (Zubrowski, 2002) but has a key focus on sustainability and the social sciences. Science has the potential to be integrated with all KLAs (e.g., Cohen & Staley, 1982; Dobbs, 1995; James et al., 2000). Yet “integration” appears to be a confusing term. Integration has an educational meaning focused on special education students being assimilated into mainstream classrooms. The word was also used in the late seventies, generally for thematic approaches to teaching. For instance, a science theme about flight only had to have a student draw a picture of a plane to show integration; it did not connect the anticipated outcomes from science and art. The term “fusing curricula” denotes a seamless bonding between two subjects; hence standards (or outcomes) need to be linked from both subjects. This also goes beyond merely embedding one subject within another. Embedding implies that one subject is dominant, while fusing curricula proposes an equal mix of learning within both subject areas. Primary education in Queensland has eight KLAs, each with its established content and each with a proposed structure for levels of learning. Primary teachers attempt to cover these syllabus requirements across the eight KLAs in less than five hours a day, and between the many extra-curricular activities occurring throughout a school year (e.g., Easter activities, Education Week, concerts, excursions, performances). In Australia, education systems have developed standards for all KLAs (e.g., Education Queensland, NSW Department of Education and Training, Victorian Education), usually designated by a code.
In the late 1990s in Queensland, “core learning outcomes” were introduced for strands across all KLAs. For example, LL2.1 in the Queensland Education science syllabus means Life and Living at Level 2, standard number 1. Thus, a teacher's planning requires the inclusion of standards as indicated by the presiding syllabus. More recently, the core learning outcomes were replaced by “essential learnings”. These specify “what students should be taught and what is important for students to have opportunities to know, understand and be able to do” (Queensland Studies Authority, 2009, para. 1). Fusing science education with other KLAs may facilitate more efficient use of time and resources; however, this type of planning needs to combine standards from two syllabuses. To further assist in facilitating sound pedagogical practices, there are models proposed for learning science, technology and other KLAs, such as Bloom's Taxonomy (Bloom, 1956), Productive Pedagogies (Education Queensland, 2004), de Bono's Six Hats (de Bono, 1985), and Gardner's Multiple Intelligences (Gardner, 1999), that imply, warrant, or necessitate fused curricula. Bybee's 5Es model (Bybee, 1997), for example, with its five levels of learning (engage, explore, explain, elaborate, and evaluate), has the potential for fusing science and ICT standards.
Abstract:
The culture of mashups which is examined by the contributions collected in this volume is a symptom of a wider paradigm shift in our engagement with information – a term which should be understood here in its broadest sense, ranging from factual material to creative works. It is a shift which has been a long time coming and has had many precedents, from the collage art of the Dadaists in the 1920s to the music mixtapes of the 70s and 80s, and finally to the explosion of mashup‐style practices that was enabled by modern computing technologies.
Abstract:
The explosion in the use of online social networks is an important phenomenon that provides a new set of entrepreneurial opportunities. Emerging musicians have been among the first to exploit this new market opportunity – and indeed, many have used it successfully. A recent study by Carter (2009) reveals that the artists who earned the most had an online presence on multiple online social sites and services, such as MySpace and Facebook. These web pages are leveraged to build fan bases and develop different types of revenue streams. Yet little is currently known about the discovery or exploitation of such opportunities.
Abstract:
The ‘anti-’ of ‘(Anti)Queer’ is a queer anti. In particle physics, a domain of science which was for a long time peddled as ultimately knowable, rational and objective, the postmodern turn has made everything queer (or chaotic, as the scientific version of this turn is perhaps more commonly named). This is a world where not only do two wrongs not make a right, but a negative and a positive do not calmly cancel each other out to leave nothing, as mathematics might suggest. When matter meets anti-matter, the resulting explosion can produce not only energy - heat and light? - but new matter. We live in a world whose very basics are no longer the electron and the positron, but an ever proliferating number of chaotic, unpredictable - queer? - subatomic particles. Some are ‘charmed’, others merely ‘strange’. Weird science indeed. The ‘Anti-’ of ‘Anti-Queer’ does not place itself neatly into binaries. This is not a refutation of all that queer has been or will be. It is explicitly a confrontation, a challenge, an attempt to take seriously not only the claims made for queer but the potent contradictions and silences which stand proudly when any attempt is made to write a history of the term. Specifically, ‘Anti-Queer’ is not Beyond Queer, the title of Bruce Bawer's 1996 book which calmly and self-confidently explains the failings of queer, extols a return to a liberal political theory of cultural change and places its own marker on queer as a movement whose purpose has been served. We are not Beyond Queer. And if we are Anti-Queer, it is only to challenge those working in the arena to acknowledge and work with some of the facts of the movement's history whose productivity has been erased with a gesture which has proved, bizarrely, to be reductive and homogenising.