Abstract:
We present 65 optical spectra of the Type Ia supernova SN 2012fr, of which 33 were obtained before maximum light. At early times SN 2012fr shows clear evidence of a high-velocity feature (HVF) in the Si II 6355 line which can be cleanly decoupled from the lower velocity "photospheric" component. This Si II 6355 HVF fades by phase -5; subsequently, the photospheric component exhibits a very narrow velocity width and remains at a nearly constant velocity of v~12,000 km/s until at least 5 weeks after maximum brightness. The Ca II infrared (IR) triplet exhibits similar evidence for both a photospheric component at v~12,000 km/s with narrow line width and long velocity plateau, as well as a high-velocity component beginning at v~31,000 km/s two weeks before maximum. SN 2012fr resides on the border between the "shallow silicon" and "core-normal" subclasses in the Branch et al. (2009) classification scheme, and on the border between normal and "high-velocity" SNe Ia in the Wang et al. (2009a) system. Though it is a clear member of the "low velocity gradient" (LVG; Benetti et al. 2005) group of SNe Ia and exhibits a very slow light-curve decline, it shows key dissimilarities with the overluminous SN 1991T or SN 1999aa subclasses of SNe Ia. SN 2012fr represents a well-observed SN Ia at the luminous end of the normal SN Ia distribution, and a key transitional event between nominal spectroscopic subclasses of SNe Ia.
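Velocities such as the ~12,000 km/s photospheric value quoted above are inferred from the blueshift of the Si II 6355 Å absorption minimum. As a rough illustration only (not the authors' fitting procedure, and with a purely illustrative observed wavelength), the relativistic Doppler conversion looks like this:

```python
import math

C_KMS = 299_792.458  # speed of light, km/s

def line_velocity(lambda_obs_angstrom, lambda_rest_angstrom=6355.0):
    """Expansion velocity implied by a blueshifted absorption minimum.

    Relativistic Doppler relation; returns a positive value for a
    blueshift (lambda_obs < lambda_rest).
    """
    r = lambda_obs_angstrom / lambda_rest_angstrom
    beta = (1.0 - r ** 2) / (1.0 + r ** 2)  # positive for blueshift
    return beta * C_KMS

# Illustrative only: an absorption minimum near ~6100 A corresponds to
# roughly the ~12,000 km/s photospheric velocity quoted in the abstract.
print(f"{line_velocity(6100.0):.0f} km/s")
```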
Abstract:
We present a comparison of two Suzaku X-ray observations of the nearby (z = 0.184), luminous (L ∼ 10⁴⁷ erg s⁻¹) type I quasar, PDS 456. A new 125 ks Suzaku observation in 2011 caught the quasar during a period of low X-ray flux and with a hard X-ray spectrum, in contrast with a previous 190 ks Suzaku observation in 2007 when the quasar appeared brighter and had a steep (Γ > 2) X-ray spectrum. The 2011 X-ray spectrum contains a pronounced trough near 9 keV in the quasar rest frame, which can be modeled with blueshifted iron K-shell absorption, most likely from the He- and H-like transitions of iron. The absorption trough is observed at a similar rest-frame energy as in the earlier 2007 observation, which appears to confirm the existence of a persistent high-velocity wind in PDS 456, at an outflow velocity of 0.25-0.30c. The spectral variability between 2007 and 2011 can be accounted for by variations in a partial covering absorber, increasing in covering fraction from the brighter 2007 observation to the hard and faint 2011 observation. Overall, the low-flux 2011 observation can be explained if PDS 456 is observed at relatively low inclination angles through a Compton-thick wind, originating from the accretion disk, which significantly attenuates the X-ray flux from the quasar.
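As a back-of-envelope check on the quoted 0.25-0.30c (not the spectral modelling performed in the paper), the line-of-sight relativistic Doppler relation converts a trough near 9 keV in the quasar rest frame into an outflow velocity, assuming rest energies of ~6.70 keV (Fe XXV Heα) and ~6.97 keV (Fe XXVI Lyα) for the He- and H-like iron K-shell transitions:

```python
def outflow_beta(e_obs_kev, e_rest_kev):
    """Line-of-sight outflow velocity (v/c) from a blueshifted absorption line.

    Relativistic Doppler: e_obs / e_rest = sqrt((1 + beta) / (1 - beta))
    for material approaching the observer, solved here for beta.
    """
    r2 = (e_obs_kev / e_rest_kev) ** 2
    return (r2 - 1.0) / (r2 + 1.0)

# Rest energies of the He- and H-like iron K-shell resonance lines (keV).
FE_XXV_HEA, FE_XXVI_LYA = 6.70, 6.97

for e_rest in (FE_XXVI_LYA, FE_XXV_HEA):
    beta = outflow_beta(9.0, e_rest)  # trough near 9 keV in the quasar rest frame
    print(f"E_rest = {e_rest:.2f} keV  ->  v ~ {beta:.2f} c")
```

With these rest energies the 9 keV trough gives v ~ 0.25c and ~0.29c, consistent with the range quoted in the abstract.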
Abstract:
In the aftermath of the financial crash of 2008, policy makers operating in international financial regulatory networks discovered macroprudential regulation (MPR). MPR has nevertheless had a stunted or arrested development, which this article explains with reference to five factors.
Abstract:
Thermocouples are one of the most popular devices for temperature measurement due to their robustness, ease of manufacture and installation, and low cost. However, when used in certain harsh environments, for example in combustion systems and engine exhausts, large wire diameters are required, and consequently the measurement bandwidth is reduced. This article discusses a software compensation technique to address the loss of high-frequency fluctuations, based on measurements from two thermocouples. In particular, a difference equation (DE) approach is proposed and compared with existing methods, both in simulation and on experimental test-rig data with constant flow velocity. It is found that the DE algorithm, combined with the use of generalized total least squares for parameter identification, provides better performance in terms of time-constant estimation without any a priori assumption on the time-constant ratios of the thermocouples.
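The two-thermocouple idea rests on each probe behaving as a first-order lag, τ_i dT_i/dt + T_i = T_gas; eliminating the unknown gas temperature between the two measurements yields a relation whose parameters can be identified and then used to reconstruct the fast fluctuations. The sketch below illustrates this in continuous time, with classical total least squares (via an SVD) standing in for the generalized total least squares named in the abstract. It is not the authors' difference-equation algorithm, and the time constants, noise levels and signal are synthetic.

```python
import numpy as np

def simulate_thermocouple(t_gas, dt, tau):
    """First-order lag model, tau * dT/dt + T = T_gas (forward Euler)."""
    t_meas = np.empty_like(t_gas)
    t_meas[0] = t_gas[0]
    for k in range(len(t_gas) - 1):
        t_meas[k + 1] = t_meas[k] + dt / tau * (t_gas[k] - t_meas[k])
    return t_meas

def estimate_time_constants(t1, t2, dt):
    """Total-least-squares estimate of tau1, tau2 from two measured signals.

    Since T1 + tau1*dT1/dt = T2 + tau2*dT2/dt (both equal the gas temperature):
        [-dT1/dt, dT2/dt] @ [tau1, tau2]^T = T1 - T2
    solved by classical TLS via the SVD of the augmented data matrix.
    """
    d1 = np.gradient(t1, dt)
    d2 = np.gradient(t2, dt)
    A = np.column_stack([-d1, d2])
    b = (t1 - t2)[:, None]
    _, _, vt = np.linalg.svd(np.hstack([A, b]))
    v = vt.T
    tau1, tau2 = -v[:2, 2] / v[2, 2]
    return tau1, tau2

def reconstruct_gas_temperature(t1, tau1, dt):
    """Invert the first-order lag of sensor 1: T_gas = T1 + tau1 * dT1/dt."""
    return t1 + tau1 * np.gradient(t1, dt)

if __name__ == "__main__":
    dt = 1e-3
    t = np.arange(0.0, 2.0, dt)
    t_gas = 300.0 + 20.0 * np.sin(2 * np.pi * 5.0 * t)  # synthetic gas temperature
    t1 = simulate_thermocouple(t_gas, dt, tau=0.05) + np.random.normal(0, 0.05, t.size)
    t2 = simulate_thermocouple(t_gas, dt, tau=0.12) + np.random.normal(0, 0.05, t.size)

    tau1, tau2 = estimate_time_constants(t1, t2, dt)
    t_rec = reconstruct_gas_temperature(t1, tau1, dt)
    print(f"estimated tau1 ~ {tau1:.3f} s, tau2 ~ {tau2:.3f} s")
```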
Abstract:
No abstract available
Abstract:
The determination of the efflux velocity is key to the process of calculating the subsequent value of velocity at any other location within a propeller jet. This paper reports on the findings of an experimental investigation into the magnitude of the efflux velocities within the jets produced by four differing propellers. Measurements of velocity have been made using a 3D LDA system with the test propellers operating at a range of rotational speeds which bound typical operational values. Comparisons are made with existing predictive theories and to aid design engineers, methods are presented by which the 3D efflux velocity components, as well as the resultant efflux value, can be determined.
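For orientation, the standard axial-momentum-theory estimate found in the propeller-jet literature (not the new prediction methods presented in the paper) approximates the efflux velocity as V0 = 1.59 n D √Ct, with n the rotational speed in rev/s, D the propeller diameter and Ct the thrust coefficient. A minimal sketch with purely illustrative values:

```python
import math

def efflux_velocity_axial_momentum(n_rps, diameter_m, thrust_coefficient):
    """Efflux velocity from axial momentum theory: V0 = 1.59 * n * D * sqrt(Ct).

    n_rps: propeller rotational speed (revolutions per second)
    diameter_m: propeller diameter (m)
    thrust_coefficient: non-dimensional thrust coefficient Ct
    """
    return 1.59 * n_rps * diameter_m * math.sqrt(thrust_coefficient)

# Illustrative values only (not from the paper): a 0.15 m model propeller
# at 10 rev/s with Ct = 0.35 gives an efflux velocity of roughly 1.4 m/s.
print(f"V0 ~ {efflux_velocity_axial_momentum(10.0, 0.15, 0.35):.2f} m/s")
```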
Abstract:
The accurate definition of the extreme wave loads which act on offshore structures represents a significant challenge for design engineers, and even with decades of empirical data on which to base designs there are still failures attributed to wave loading. The environmental conditions which cause these loads are infrequent and highly non-linear, which means that they are not well understood or simple to describe. If the structure is large enough to affect the incident wave significantly, further non-linear effects can influence the loading. Moreover, if the structure is floating and excited by the wave field, then its responses, which are also likely to be highly non-linear, must be included in the analysis. This makes the loading on such a structure difficult to determine, and the design codes will often suggest employing various tools, including small-scale experiments, numerical and analytical methods, as well as empirical data if available.
Wave Energy Converters (WECs) are a new class of offshore structure which pose new design challenges, lacking the design codes and empirical data found in other industries. These machines are located in highly exposed and energetic sites, are designed to be excited by the waves, and will be expected to withstand extreme conditions over their 25-year design life. One such WEC, called Oyster, is being developed by Aquamarine Power Ltd. Oyster is a buoyant flap, hinged close to the seabed in water depths of 10 to 15 m, which pierces the water surface. The flap is driven back and forth by the action of the waves, and this mechanical energy is then converted to electricity.
Previous experiments have identified that Oyster is not only subject to wave impacts but also occasionally slams into the water surface with high angular velocity. This slamming has been identified as an extreme load case, and work is ongoing to describe it in terms of the pressure exerted on the outer skin and the transfer of this short-duration impulsive load through various parts of the structure.
This paper describes a series of 40th-scale experiments undertaken to investigate the pressure on the face of the flap during the slamming event. A vertical array of pressure sensors is used to measure the pressure exerted on the flap. Characteristics of the slam pressure, such as the rise time, magnitude, spatial distribution and temporal evolution, are revealed. Similarities are drawn between this slamming phenomenon and classical water-entry problems, such as ship hull slamming. With this similitude identified, common analytical tools are used to predict the slam pressure, which is compared with that measured in the experiments.
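As a pointer to the kind of classical water-entry estimate alluded to above (a Wagner-type wedge result, not necessarily the specific analytical tools or values used in the paper), the peak slam pressure for a section with local deadrise angle β entering the water at speed V is roughly ½ρ(πV/(2 tan β))². For a rotating flap, the entry speed could plausibly be taken locally as ωr. The numbers below are illustrative only:

```python
import math

RHO_SEAWATER = 1025.0  # kg/m^3

def wagner_peak_pressure(entry_speed_ms, deadrise_deg, rho=RHO_SEAWATER):
    """Wagner-type peak impact pressure for a wedge section entering calm water.

    p_max ~ 0.5 * rho * (pi * V / (2 * tan(beta)))**2,
    where V is the entry speed and beta the local deadrise angle.
    """
    beta = math.radians(deadrise_deg)
    return 0.5 * rho * (math.pi * entry_speed_ms / (2.0 * math.tan(beta))) ** 2

# Illustrative only: a section entering at 2 m/s with a 10 degree local deadrise.
print(f"p_max ~ {wagner_peak_pressure(2.0, 10.0) / 1e3:.0f} kPa")
```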
Abstract:
The principal feature in the evolution of the internet has been its ever-growing reach to include old and young, rich and poor. The internet’s ever-encroaching presence has transported it from our desktop to our pocket and into our glasses. This is illustrated in the Internet Society Questionnaire on Multistakeholder Governance, which found that the main factors affecting change in the Internet governance landscape were more users online from more countries and the influence of the internet over daily life. The omnipresence of the internet is self-perpetuating; its usefulness grows with every new user and every new piece of data uploaded. The advent of social media and the creation of a virtual presence for each of us, even when we are not physically present or ‘logged on’, means we are fast approaching the point where we are all connected, to everyone else, all the time. We have moved far beyond the point where governments can claim to represent our views, which evolve constantly rather than being measured in electoral cycles.
The shift, which has seen citizens become creators of content rather than consumers of it, has undermined the centralist view of democracy and created an environment of wiki democracy or crowd-sourced democracy. This is at the heart of what is generally known as Web 2.0, and is widely considered to be a positive, democratising force. However, we argue, there are worrying elements here too. Government does not always deliver on the promise of the networked society as it involves citizens and others in the process of government. Also, a number of key internet companies have emerged as powerful intermediaries, harnessing the efforts of the many and re-using and re-selling the products and data of content providers in the Web 2.0 environment. A discourse about openness and transparency has been offered as a democratising rationale, but much of this masks an uneven relationship where the value of online activity flows not to the creators of content but to those who own the channels of communication and the metadata that they produce.
In this context the state is just one stakeholder in the mix of influencers and opinion formers impacting on our behaviours, and indeed on our ideas of what is public. The question of what it means to create or own something, and how all these new relationships are to be ordered and governed, is subject to fundamental change. While government can often appear slow, unwieldy and even irrelevant in much of this context, there remains a need for some sort of political control to deal with the challenges that technology creates but cannot by itself control. In order for the internet to continue to evolve successfully, both technically and socially, it is critical that the multistakeholder nature of internet governance be understood and acknowledged, and perhaps, to an extent, re-balanced. Stakeholders can no longer be classified under the broad headings of government, private sector and civil society, with their roles seen as some sort of benign and open co-production. Each user of the internet has a stake in its efficacy, and each by their presence and participation contributes to the experience, positive or negative, of other users, as well as to the commercial success or otherwise of various online service providers. However, stakeholders have neither an equal role nor an equal share. The unequal relationship between the providers of content and those who simply package up and transmit that content, while harvesting the valuable data thus produced, needs to be addressed. Arguably this suggests a role for government that involves moving beyond simply celebrating and facilitating the ongoing technological revolution. This paper reviews the shifting landscape of stakeholders and their contribution to the efficacy of the internet. It critically evaluates the primacy of the individual as the key stakeholder and their supposed developing empowerment within the ever-growing sea of data. It also looks at the role of individuals in wider governance roles. Governments in a number of jurisdictions have sought to engage, consult or empower citizens through technology, but in general these attempts have had little appeal. Citizens have been too busy engaging, consulting and empowering each other to pay much attention to what their governments are up to. George Orwell’s view of the future has not come to pass; in fact, the internet has ensured the opposite. There is no Big Brother, but we are all looking over each other’s shoulders all the time, while a number of big corporations capture and sell all this collective endeavour back to us.
Abstract:
Low-velocity impact damage can drastically reduce the residual mechanical properties of a composite structure even when there is barely visible impact damage. The ability to computationally predict the extent of damage and the compression-after-impact (CAI) strength of a composite structure can potentially lead to the exploration of a larger design space without incurring significant development time and cost penalties. A three-dimensional damage model, to predict both low-velocity impact damage and the CAI strength of composite laminates, has been developed and implemented as a user material subroutine in the commercial finite element package ABAQUS/Explicit. The virtual tests were executed in two steps, one to capture the impact damage and the other to predict the CAI strength. The observed intra-laminar damage features and delamination damage area, as well as the residual strength, are discussed. It is shown that the predicted results for impact damage and CAI strength correlate well with experimental testing.
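The paper's specific 3D damage model is not reproduced here, but as an illustration of the kind of intralaminar damage-initiation check such a user material subroutine typically evaluates at each integration point, the sketch below implements the classic Hashin (1980) 3D criteria for a unidirectional ply, with purely illustrative ply strengths and stress state:

```python
from dataclasses import dataclass

@dataclass
class PlyStrengths:
    """Unidirectional ply strengths (MPa)."""
    XT: float  # longitudinal tension
    XC: float  # longitudinal compression
    YT: float  # transverse tension
    YC: float  # transverse compression
    SL: float  # longitudinal (in-plane) shear
    ST: float  # transverse shear

def hashin_3d(s11, s22, s33, s12, s13, s23, p: PlyStrengths):
    """Hashin (1980) 3D failure indices for one stress state; a value >= 1 signals initiation."""
    shear = (s12 ** 2 + s13 ** 2) / p.SL ** 2
    fibre = (s11 / p.XT) ** 2 + shear if s11 >= 0.0 else (s11 / p.XC) ** 2
    s_t = s22 + s33
    inter = (s23 ** 2 - s22 * s33) / p.ST ** 2
    if s_t >= 0.0:
        matrix = (s_t / p.YT) ** 2 + inter + shear
    else:
        matrix = (((p.YC / (2 * p.ST)) ** 2 - 1) * s_t / p.YC
                  + (s_t / (2 * p.ST)) ** 2 + inter + shear)
    return {"fibre": fibre, "matrix": matrix}

# Illustrative carbon/epoxy-like strengths and an arbitrary ply stress state (MPa).
ply = PlyStrengths(XT=2300, XC=1500, YT=60, YC=200, SL=90, ST=60)
print(hashin_3d(s11=800.0, s22=-40.0, s33=5.0, s12=35.0, s13=10.0, s23=8.0, p=ply))
```

In an explicit finite element implementation these indices would drive stiffness degradation laws; the criteria above are shown only as a generic example of damage initiation, not as the authors' formulation.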