295 results for Inside-Outside Algorithm
Abstract:
The Fluid–Structure Interaction (FSI) problem is significant in science and engineering and poses challenges for computational mechanics. The coupled Finite Element–Smoothed Particle Hydrodynamics (FE-SPH) model is a robust technique for simulating FSI problems. However, two important steps in the coupled FE-SPH model, neighbor searching and contact searching, are extremely time-consuming. The Point-In-Box (PIB) searching algorithm was developed by Swegle to improve searching efficiency. However, it has the shortcoming that its efficiency can be significantly affected by the distribution of points (nodes in FEM and particles in SPH). In this paper, in order to improve searching efficiency, a novel Striped-PIB (S-PIB) searching algorithm is proposed to overcome the point-distribution shortcoming of the PIB algorithm, and the two time-consuming steps of neighbor searching and contact searching are integrated into a single searching step. The accuracy and efficiency of the newly developed searching algorithm are studied through efficiency tests and FSI problems. It is found that the newly developed model can significantly improve computational efficiency, and it is believed to be a powerful tool for FSI analysis.
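The abstract does not spell out the S-PIB data structure, so the sketch below is only a minimal Python illustration of the general bucketed-searching idea behind PIB-style algorithms: points are binned into stripes along one axis and candidate pairs are drawn only from the same or adjacent stripes. The function name, stripe width and test data are assumptions, not the paper's method.

```python
import numpy as np

def striped_neighbor_search(points, radius):
    """Toy bucketed neighbor search: bin points into stripes of width `radius`
    along the x-axis and only test pairs from the same or adjacent stripes.
    A rough illustration of the bucketing idea, not the S-PIB algorithm itself."""
    stripe_ids = np.floor(points[:, 0] / radius).astype(int)
    buckets = {}
    for idx, s in enumerate(stripe_ids):
        buckets.setdefault(s, []).append(idx)

    pairs = []
    for s, members in buckets.items():
        # pairs inside the same stripe
        for a in range(len(members)):
            for b in range(a + 1, len(members)):
                i, j = members[a], members[b]
                if np.linalg.norm(points[i] - points[j]) <= radius:
                    pairs.append((i, j))
        # pairs between this stripe and the next one (each cross pair seen once)
        for i in members:
            for j in buckets.get(s + 1, []):
                if np.linalg.norm(points[i] - points[j]) <= radius:
                    pairs.append((i, j))
    return pairs

# Example: treat FE nodes and SPH particles alike as 2-D points
rng = np.random.default_rng(0)
pts = rng.random((1000, 2))
print(len(striped_neighbor_search(pts, 0.05)), "neighbor pairs found")
```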
Abstract:
Introduction: reading the signs. Inside the dance ethos, knowledge is rarely articulated other than through the experience of dance itself. On the surface, the dancer focuses on practical and specialist skills. However, a closer look reveals that this knowledge does not merely trigger an embodied way of thinking; it enables the dancer to map a trail of metaphors within the body. In effect, dancers acquire a distinct embodied culture with its own language, dialects, customs and traditions. In this paper, I shall firstly examine the way metaphors establish a link between reason and imagination, between one set of embodied knowledge and another. It is in regard to this function, where metaphor welds opposites together or where interior and exterior information exist in the same moment, that it is most useful for jumping the fence from dance to cross-disciplinary practice. Secondly, I shall discuss how metaphors can help sustain creative practice. For it is only by stepping outside the culture of dance that I could first unravel the experiences, processes and knowledges inscribed through a career in dance and begin to define the quality of my own voice.
Abstract:
Background: Numerous studies demonstrate the generation and short-term survival of adipose tissue; however, long-term persistence remains elusive. This study evaluates long-term survival and transferability of de novo adipose constructs based on a ligated vascular pedicle and tissue engineering chamber combination. Methods: Defined adipose tissue flaps were implanted into rats in either intact or perforated domed chambers. In half of the groups, the chambers were removed after 10 weeks and the constructs transferred on their vascular pedicle to a new site, where they were observed for a further 10 weeks. In the remaining groups, the tissue construct was observed for 20 weeks inside the chamber. Tissue volume was assessed using magnetic resonance imaging and histologic measures, and constructs were assessed for stability and necrosis. Sections were assessed histologically and for proliferation using Ki-67. Results: At 20 weeks, volume analysis revealed an increase in adipose volume from 0.04 ± 0.001 ml at the time of insertion into the chambers to 0.27 ± 0.004 ml in the closed and 0.44 ± 0.014 ml in the perforated chambers. There was an additional increase of approximately 10 to 15 percent in tissue volume in flaps that remained in chambers for 20 weeks, whereas the volume of the transferred tissue not in chambers remained unaltered. Histomorphometric assessment of the tissues documented no signs of hypertrophy, fat necrosis, or atypical changes of the newly generated tissue. Conclusion: This study presents a promising new method of generating significant amounts of mature, vascularized, stable, and transferable adipose tissue for permanent autologous soft-tissue replacement.
Abstract:
Background: Heatwaves can cause excess deaths in a local population ranging from tens to thousands within a couple of weeks. The excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the mortality expected under ‘normal’ conditions from the historical daily mortality records. Calculating excess mortality is a scientific challenge because of the stochastic temporal pattern of daily mortality data, which is characterised by (a) long-term changes in mean level (i.e., non-stationarity) and (b) a non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed in the field of signal processing for analysing non-linear and non-stationary time series data; however, it has not been applied in public health research. This paper aims to demonstrate the applicability and strength of the HHT algorithm in analysing health data. Methods: Special R functions were developed to implement the HHT algorithm and decompose the daily mortality time series into trend and non-trend components in terms of the underlying physical mechanism. The excess mortality is calculated directly from the resulting non-trend component series. Results: The Brisbane (Queensland, Australia) and Chicago (United States) daily mortality time series data were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. To calculate the excess mortality associated with the July 1995 Chicago heatwave, the HHT algorithm needed to handle the mode-mixing issue. The HHT algorithm estimated 510 excess deaths for the 1995 Chicago heatwave event. To exemplify potential applications, the HHT decomposition results were used as the input data for a subsequent regression analysis, using the Brisbane data, to investigate the association between excess mortality and different risk factors. Conclusions: The HHT algorithm is a novel and powerful analytical tool for time series data analysis. It has real potential for a wide range of applications in public health research because of its ability to decompose a non-linear and non-stationary time series into trend and non-trend components consistently and efficiently.
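The paper's R functions and the full HHT decomposition are not reproduced in the abstract; as a hedged Python illustration of the bookkeeping, the sketch below subtracts a slowly varying trend from a synthetic daily mortality series and sums the non-trend component over an event window. The moving-average trend is only a stand-in for the HHT-derived trend component, and all data are synthetic.

```python
import numpy as np

def excess_mortality(daily_deaths, event_slice, trend_window=365):
    """Estimate excess deaths over an event window as the sum of the
    non-trend component (observed minus a slowly varying trend).
    A centred moving average stands in for the HHT-derived trend here."""
    kernel = np.ones(trend_window) / trend_window
    trend = np.convolve(daily_deaths, kernel, mode="same")
    non_trend = daily_deaths - trend
    return non_trend[event_slice].sum()

# Synthetic example: ~30 expected deaths/day with a seasonal cycle,
# plus a 10-day spike mimicking a heatwave.
rng = np.random.default_rng(1)
days = np.arange(3 * 365)
baseline = 30 + 3 * np.sin(2 * np.pi * days / 365)
deaths = rng.poisson(baseline).astype(float)
deaths[400:410] += 6  # injected heatwave excess (~60 deaths)

print(round(excess_mortality(deaths, slice(400, 410))))
```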
Abstract:
A novel method of spontaneous generation of new adipose tissue from an existing fat flap is described. A defined volume of fat flap based on the superficial inferior epigastric vascular pedicle in the rat was elevated and inset into a hollow plastic chamber implanted subcutaneously in the groin of the rat. The chamber walls were either perforated or solid, and the chambers either contained a poly(D,L-lactic-co-glycolic acid) (PLGA) sponge matrix or did not. The contents were analyzed after being in situ for 6 weeks. The total volume of the flap tissue in all groups except the control groups, where the flap was not inserted into the chambers, increased significantly, especially in the perforated chambers (0.08 ± 0.007 mL at baseline compared to 1.2 ± 0.08 mL in the intact ones). Volume analysis of individual component tissues within the flaps revealed that the adipocyte volume increased and was at a maximum in the chambers without PLGA, where it expanded from 0.04 ± 0.003 mL at insertion to 0.5 ± 0.08 mL (a 1250% increase) in the perforated chambers and to 0.16 ± 0.03 mL (a 400% increase) in the intact chambers. Addition of PLGA scaffolds resulted in less fat growth. Histomorphometric analysis documented an increased number of adipocytes rather than simple hypertrophy. The new tissue was highly vascularized, and no fat necrosis or atypical changes were observed.
Abstract:
Numerous initiatives have been employed around the world in order to address rising greenhouse gas (GHG) emissions originating from the transport sector. These measures include: travel demand management (congestion‐charging), increased fuel taxes, alternative fuel subsidies and low‐emission vehicle (LEV) rebates. Incentivizing the purchase of LEVs has been one of the more prevalent approaches in attempting to tackle this global issue. LEVs, whilst having the advantage of lower emissions and, in some cases, more efficient fuel consumption, also bring the downsides of increased purchase cost, reduced convenience of vehicle fuelling, and operational uncertainty. To stimulate demand in the face of these challenges, various incentive‐based policies, such as toll exemptions, have been used by national and local governments to encourage the purchase of these types of vehicles. In order to address rising GHG emissions in Stockholm, and in line with the Swedish Government’s ambition to operate a fossil free fleet by 2030, a number of policies were implemented targeting the transport sector. Foremost amongst these was the combination of a congestion charge – initiated to discourage emissions‐intensive travel – and an exemption from this charge for some LEVs, established to encourage a transition towards a ‘green’ vehicle fleet. Although both policies shared the aim of reducing GHG emissions, the exemption for LEVs carried the risk of diminishing the effectiveness of the congestion charging scheme. As the number of vehicle owners choosing to transition to an eligible LEV increased, the congestion‐reduction effectiveness of the charging scheme weakened. In fact, policy makers quickly recognized this potential issue and consequently phased out the LEV exemption less than 18 months after its introduction (1). Several studies have investigated the demand for LEVs through stated‐preference (SP) surveys across multiple countries, including: Denmark (2), Germany (3, 4), UK (5), Canada (6), USA (7, 8) and Australia (9). Although each of these studies differed in approach, all involved SP surveys where differing characteristics between various types of vehicles, including LEVs, were presented to respondents and these respondents in turn made hypothetical decisions about which vehicle they would be most likely to purchase. Although these studies revealed a number of interesting findings in regards to the potential demand for LEVs, they relied on SP data. In contrast, this paper employs an approach where LEV choice is modelled by taking a retrospective view and by using revealed preference (RP) data. By examining the revealed preferences of vehicle owners in Stockholm, this study overcomes one of the principal limitations of SP data, namely that stated preferences may not in fact reflect individuals’ actual choices, such as when cost, time, and inconvenience factors are real rather than hypothetical. This paper’s RP approach involves modelling the characteristics of individuals who purchased new LEVs, whilst estimating the effect of the congestion charging exemption upon choice probabilities and subsequent aggregate demand. The paper contributes to the current literature by examining the effectiveness of a toll exemption under revealed preference conditions, and by assessing the total effect of the policy based on key indicators for policy makers, including: vehicle owner home location, commuting patterns, number of children, age, gender and income. 
The two main research questions motivating this study were: which individuals chose to purchase a new LEV in Stockholm in 2008, and how did the congestion charging exemption affect the aggregate demand for new LEVs in Stockholm in 2008? In order to answer these research questions, the analysis was split into two stages. Firstly, a multinomial logit (MNL) model was used to identify which demographic characteristics were most significantly related to the purchase of an LEV over a conventional vehicle. The three most significant variables were found to be: intra-cordon residency (positive); commuting across the cordon (positive); and distance of residence from the cordon (negative). In order to estimate the effect of the exemption policy on vehicle purchase choice, the model included variables to control for geographic differences in preferences, based on the location of the vehicle owners' homes and workplaces in relation to the congestion-charging cordon boundary. These variables included one indicator representing commutes across the cordon and another indicator representing intra-cordon residency. The effect of the exemption policy on the probability of purchasing LEVs was estimated in the second stage of the analysis by focusing on the groups of vehicle owners that were most likely to have been affected by the policy, i.e., those commuting across the cordon boundary (in both directions). Given the inclusion of the indicator variable representing commutes across the cordon, it is assumed that the estimated coefficient of this variable captures the effect of the exemption policy on the utility of choosing to purchase an exempt LEV for these two groups of vehicle owners. The intra-cordon residency indicator variable also controls for differences between the two groups, based upon direction of travel across the cordon boundary. A counter-hypothesis to this assumption is that the coefficient of the variable representing commuting across the cordon boundary instead only captures geodemographic differences that lead to variations in LEV ownership across the different groups of vehicle owners in relation to the cordon boundary. In order to address this counter-hypothesis, an additional analysis was performed on data from a city with a similar geodemographic pattern to Stockholm: Gothenburg, Sweden's second largest city. The results of this analysis provided evidence to support the argument that the coefficient of the variable representing commutes across the cordon was capturing the effect of the exemption policy. Based upon this framework, the predicted vehicle type shares were calculated using the estimated coefficients of the MNL model and compared with predicted vehicle type shares from a simulated scenario where the exemption policy was inactive. This simulated scenario was constructed by setting the coefficient for the variable representing commutes across the cordon boundary to zero for all observations, removing the utility benefit of the exemption policy. Overall, this second stage of the analysis showed that the exemption had a substantial effect upon the probability of purchasing, and the aggregate demand for, exempt LEVs in Stockholm during 2008. By making use of unique evidence of the revealed preferences of LEV owners, this study identifies the common characteristics of new LEV owners and estimates the effect of Stockholm's congestion charging exemption upon the demand for new LEVs during 2008.
It was found that the variables with the greatest effect upon the choice of purchasing an exempt LEV included intra-cordon residency (positive), distance of home from the cordon (negative), and commuting across the cordon (positive). It was also determined that owners under the age of 30 years preferred non-exempt LEVs (low-CO2 LEVs), whilst those over the age of 30 years preferred electric vehicles. In terms of electric vehicles, individuals living within the city had the highest propensity towards purchasing this vehicle type, and a negative relationship between choosing an electric vehicle and the distance of an individual's residence from the cordon was also evident. Overall, the congestion charging exemption was found to have increased the share of exempt LEVs in Stockholm by 1.9%, with, as expected, a much stronger effect on those commuting across the boundary: owners living inside the cordon showed a 13.1% increase and those living outside the cordon a 5.0% increase. This increase in demand corresponded to an additional 538 (+/- 93; 95% C.I.) new exempt LEVs purchased in Stockholm during 2008 (out of a total of 5 427; 9.9%). Policy makers can take note that an incentive-based policy can increase the demand for LEVs and appears to be an appropriate approach to adopt when attempting to reduce transport emissions by encouraging a transition towards a ‘green’ vehicle fleet.
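The estimated coefficients are not reported in this abstract, so the sketch below is only an illustrative Python mock-up of the second-stage counterfactual: multinomial-logit choice probabilities are computed from made-up coefficients and then recomputed with the cross-cordon-commute coefficient for the exempt-LEV alternative set to zero, mimicking removal of the exemption's utility benefit. All covariates and coefficient values are assumptions.

```python
import numpy as np

def mnl_probabilities(X, betas):
    """Multinomial logit choice probabilities over alternatives, given one
    coefficient vector per alternative (conventional car as the base)."""
    utilities = np.column_stack([X @ b for b in betas])
    expu = np.exp(utilities - utilities.max(axis=1, keepdims=True))
    return expu / expu.sum(axis=1, keepdims=True)

# Illustrative covariates: [intercept, intra-cordon resident, commutes across cordon]
X = np.array([
    [1, 1, 1],   # lives inside the cordon, commutes across it
    [1, 0, 1],   # lives outside the cordon, commutes across it
    [1, 0, 0],   # neither
], dtype=float)

# Made-up coefficients for [conventional, exempt LEV, non-exempt LEV]
betas = [np.zeros(3),                    # base alternative
         np.array([-2.0, 0.8, 0.6]),     # exempt LEV
         np.array([-2.5, 0.3, 0.0])]     # non-exempt LEV

p_with = mnl_probabilities(X, betas)

# Counterfactual: remove the exemption's utility benefit by zeroing the
# cross-cordon-commute coefficient of the exempt-LEV alternative.
betas_no_exemption = [b.copy() for b in betas]
betas_no_exemption[1][2] = 0.0
p_without = mnl_probabilities(X, betas_no_exemption)

print("Change in exempt-LEV share:", p_with[:, 1] - p_without[:, 1])
```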
Abstract:
Tissue engineering is a multidisciplinary field with the potential to replace tissues lost as a result of trauma, cancer surgery, or organ dysfunction. The successful production, integration, and maintenance of any tissue-engineered product are a result of numerous molecular interactions inside and outside the cell. We consider the essential elements for successful tissue engineering to be a matrix scaffold, space, cells, and vasculature, each of which has a significant and distinct molecular underpinning (Fig. 1). Our approach capitalizes on these elements. Originally developed in the rat, our chamber model (Fig. 2) involves the placement of an arteriovenous loop (the vascular supply) in a polycarbonate chamber (protected space) with the addition of cells and an extracellular matrix such as Matrigel or endogenous fibrin (34, 153, 246, 247). This model has also been extended to the rabbit and pig (J. Dolderer, M. Findlay, W. Morrison, manuscript in preparation), and has been modified for the mouse to grow adipose tissue and islet cells (33, 114, 122) (Fig. 3)...
Abstract:
This chapter addresses the radical paucity of empirical data about the career destinations of journalism, media and communications graduates from degree programs. We report findings from a study of ten years of graduates from Queensland University of Technology’s courses in journalism, media, and communication studies, using a ‘Creative Trident’ lens to analyse micro individual survey data. The study findings engage with creative labour precarity discussions, and also assertions of creative graduate oversupply suggested by national graduate outcome statistics. We describe the graduates’ employment outcomes, characterise their early career movements into and out of embedded and specialist employment, and compare the capability requirements and degree of course relevance reported by graduates employed in the different Trident segments. Given that in general the graduates in this study enjoyed very positive employment outcomes, but that there were systematic differences in reported course relevance by segment of employment and role, we also consider how university programs can best engage with the task of educating students for a surprisingly diverse range of media and communication-related occupational outcomes within and outside the creative industries.
Abstract:
This report studies an algebraic equation whose solution gives the image system of a source of light as seen by an observer inside a reflecting spherical surface. The equation is examined numerically using GeoGebra. Under the hypothesis that our galaxy is enveloped by a reflecting interface, this becomes a possible model for many mysterious extragalactic observations.
Abstract:
The Common Scrambling Algorithm Stream Cipher (CSA-SC) is a shift-register-based stream cipher designed to encrypt digital video broadcasts. CSA-SC produces a pseudo-random binary sequence that is used to mask the contents of the transmission. In this paper, we analyse the initialisation process of the CSA-SC keystream generator and demonstrate weaknesses which lead to state convergence, slid pairs and shifted keystreams. As a result, the cipher may be vulnerable to distinguishing attacks, time-memory-data trade-off attacks or slide attacks.
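CSA-SC itself is not specified here, so purely to illustrate what a slid pair and a shifted keystream mean, the toy Python sketch below uses a small LFSR whose "initialisation" simply clocks the loaded state; two loaded states that differ by a few clocks then yield keystreams that are shifts of each other. The register size, taps and clock counts are arbitrary and bear no relation to the real cipher.

```python
def lfsr_step(state, taps=(0, 2), nbits=16):
    """One step of a Fibonacci LFSR: output the LSB, shift in the XOR of the taps."""
    fb = 0
    for t in taps:
        fb ^= (state >> t) & 1
    out = state & 1
    state = (state >> 1) | (fb << (nbits - 1))
    return state, out

def keystream(loaded_state, init_clocks=64, length=16):
    """Toy cipher: 'initialise' by clocking the loaded state, then emit output bits."""
    s = loaded_state
    for _ in range(init_clocks):
        s, _ = lfsr_step(s)
    bits = []
    for _ in range(length):
        s, b = lfsr_step(s)
        bits.append(b)
    return bits

# A 'slid pair': the second loaded state equals the first clocked 3 extra times,
# so its keystream is the first keystream shifted by 3 positions.
s1 = 0xACE1
s2 = s1
for _ in range(3):
    s2, _ = lfsr_step(s2)

k1 = keystream(s1, length=19)
k2 = keystream(s2, length=16)
print(k1[3:] == k2)  # True: shifted keystreams
```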
Abstract:
Organisations are constantly seeking new ways to improve operational efficiencies. This research study investigates a novel way to identify potential efficiency gains in business operations by observing how they were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. This paper demonstrates how these trade-offs can be incorporated into the assessment of alternative process execution scenarios by making use of a cost environment. A genetic algorithm-based approach is proposed to explore and assess alternative process execution scenarios, where the objective function is represented by a comprehensive cost structure that captures different process dimensions. Experiments conducted with different variants of the genetic algorithm evaluate the approach's feasibility. The findings demonstrate that a genetic algorithm-based approach is able to use cost reduction as a way to identify improved execution scenarios in terms of reduced case durations and increased resource utilisation. The ultimate aim is to use cost-related insights gained from such improved scenarios to put forward recommendations for reducing process-related cost within organisations.
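The paper's cost structure and GA variants are not detailed in the abstract; the Python sketch below is a minimal stand-in that evolves resource-assignment scenarios for a fixed task set under a made-up objective trading off monetary cost against total duration. Task data, resource rates, weights and operators are all illustrative assumptions.

```python
import random

# Illustrative tasks: duration in hours on a standard resource
TASK_HOURS = [3, 5, 2, 8, 4, 6]
# Hypothetical resources: (hourly cost, speed factor)
RESOURCES = [(20.0, 1.0), (35.0, 1.6), (50.0, 2.2)]

def scenario_cost(assignment, time_weight=10.0):
    """Combined cost of one execution scenario: monetary cost of the
    assigned resources plus a penalty proportional to total duration."""
    money, duration = 0.0, 0.0
    for hours, res in zip(TASK_HOURS, assignment):
        rate, speed = RESOURCES[res]
        t = hours / speed
        money += rate * t
        duration += t
    return money + time_weight * duration

def evolve(pop_size=40, generations=200, mutation_rate=0.1):
    """Simple genetic algorithm over task-to-resource assignments."""
    pop = [[random.randrange(len(RESOURCES)) for _ in TASK_HOURS]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=scenario_cost)
        parents = pop[:pop_size // 2]          # keep the cheaper half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(TASK_HOURS))
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < mutation_rate:
                child[random.randrange(len(child))] = random.randrange(len(RESOURCES))
            children.append(child)
        pop = parents + children
    best = min(pop, key=scenario_cost)
    return best, scenario_cost(best)

print(evolve())
```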
Abstract:
Two kinds of floating electrode that can enhance the performance of a plasma jet, the floating dielectric barrier covered electrode (FDBCE) and the floating pin electrode (FPE), are reported. The intense discharge between the floating electrode and the powered electrode substantially decreased the voltage required to trigger the plasma jet. The plasma bullet transitioned from a ring shape to a disk shape in the high helium-concentration region when the floating electrode was entirely inside the powered ring electrode; the enhanced electric field between the propagating plasma bullet and the ground electrode is the reason for this transition. Double plasma bullets occurred when part of the FDBCE was outside the powered ring electrode, which is attributed to the structure and surface charge of the FDBCE. When part of the FPE was outside the powered ring electrode, the return stroke resulted in a single intensified plasma channel between the FPE and the ground electrode.
Abstract:
This paper presents numerical modelling of a waste heat recovery system. A thin layer of metal foam is attached to a cold plate to absorb heat from the hot gases leaving the system. The heat transferred from the exhaust gas is then transferred to a cold liquid flowing in a secondary loop. Two different foam PPI (Pores Per Inch) values are examined over a range of fluid velocities. Numerical results are then compared to both experimental data and theoretical results available in the literature. Challenges in getting the simulation results to match those of the experiments are addressed and discussed in detail. In particular, the interface boundary conditions specified between a porous layer and a fluid layer are investigated. While one physically expects a much lower fluid velocity in the pores compared to that of the free flow, capturing this sharp gradient at the interface can add to the difficulties of numerical simulation. The existing models in the literature are modified by considering the pressure gradient inside and outside the foam, and comparisons against the numerical modelling are presented. Finally, based on the experimentally validated numerical results, the thermo-hydraulic performance of foam heat exchangers as waste heat recovery units is discussed, with the main goal of reducing the excess pressure drop and maximising the amount of heat that can be recovered from the hot gas stream.
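The CFD setup is not reproduced here; as a rough back-of-the-envelope companion to the porous-layer pressure-gradient discussion, the sketch below evaluates the standard Darcy-Forchheimer relation for two foam samples. The permeability and inertial-coefficient values are placeholders, not data from the paper.

```python
# Darcy-Forchheimer pressure gradient through a porous foam layer:
#   dP/dx = (mu / K) * u + (rho * c_F / sqrt(K)) * u**2
# where K is the permeability and c_F the inertial (Forchheimer) coefficient.
import math

MU_AIR = 1.8e-5      # dynamic viscosity of air, Pa*s (approximate)
RHO_AIR = 1.2        # density of air, kg/m^3 (approximate)

# Illustrative properties for two foam samples (placeholder values)
FOAMS = {
    "10 PPI": {"K": 2.0e-7, "c_F": 0.08},
    "40 PPI": {"K": 5.0e-8, "c_F": 0.10},
}

def pressure_drop(u, K, c_F, thickness=0.01):
    """Pressure drop (Pa) across a foam layer of given thickness (m)
    for superficial velocity u (m/s)."""
    dpdx = (MU_AIR / K) * u + (RHO_AIR * c_F / math.sqrt(K)) * u ** 2
    return dpdx * thickness

for name, props in FOAMS.items():
    for u in (1.0, 3.0, 5.0):
        dp = pressure_drop(u, props["K"], props["c_F"])
        print(f"{name}, u = {u:.1f} m/s: dP = {dp:.1f} Pa")
```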
Abstract:
Live migration of multiple Virtual Machines (VMs) has become an integral management activity in data centers for power saving, load balancing and system maintenance. While state-of-the-art live migration techniques focus on improving the migration performance of an independent single VM, little has been investigated for the case of live migration of multiple interacting VMs. Live migration is strongly influenced by network bandwidth, and arbitrarily migrating a VM that has data inter-dependencies with other VMs may increase bandwidth consumption and adversely affect the performance of subsequent migrations. In this paper, we propose a Random Key Genetic Algorithm (RKGA) that efficiently schedules the migration of a given set of VMs, accounting for both inter-VM dependencies and the data center communication network. The experimental results show that the RKGA can schedule the migration of multiple VMs with significantly shorter total migration time and total downtime compared to a heuristic algorithm.
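The paper's RKGA and its cost model are not given in the abstract; the Python sketch below shows only the generic random-key mechanics (keys in [0, 1) decoded into a migration order by sorting) driven by a made-up migration-time model in which cross-site traffic between partially migrated VM pairs steals bandwidth. VM sizes, traffic, bandwidth and GA parameters are all assumptions.

```python
import random

# Hypothetical VMs: memory to copy (GB) and inter-VM traffic (GB/s) between pairs.
VM_MEMORY = [4, 8, 2, 16, 6]                       # GB per VM
TRAFFIC = {(0, 1): 0.5, (1, 3): 0.8, (2, 4): 0.3}  # communicating VM pairs
LINK_BANDWIDTH = 1.0                               # GB/s available for migration

def decode(keys):
    """Random-key decoding: sort VM indices by their key to get a migration order."""
    return sorted(range(len(keys)), key=lambda i: keys[i])

def total_migration_time(order):
    """Made-up cost model: VMs migrate sequentially, and while exactly one VM of a
    communicating pair has already moved, its cross-site traffic steals bandwidth
    from the ongoing migration."""
    migrated = set()
    total = 0.0
    for vm in order:
        stolen = sum(bw for (a, b), bw in TRAFFIC.items()
                     if (a in migrated) != (b in migrated))
        effective_bw = max(LINK_BANDWIDTH - stolen, 0.1)
        total += VM_MEMORY[vm] / effective_bw
        migrated.add(vm)
    return total

def rkga(pop_size=30, generations=150, elite=0.2, mutants=0.2):
    """Generic random-key GA: elites survive, fresh mutants are injected, and
    offspring come from biased uniform crossover between elite and non-elite parents."""
    pop = [[random.random() for _ in VM_MEMORY] for _ in range(pop_size)]
    n_elite, n_mut = int(elite * pop_size), int(mutants * pop_size)
    for _ in range(generations):
        pop.sort(key=lambda k: total_migration_time(decode(k)))
        elites = pop[:n_elite]
        new_mutants = [[random.random() for _ in VM_MEMORY] for _ in range(n_mut)]
        offspring = []
        while len(offspring) < pop_size - n_elite - n_mut:
            e, o = random.choice(elites), random.choice(pop[n_elite:])
            offspring.append([ek if random.random() < 0.7 else ok
                              for ek, ok in zip(e, o)])
        pop = elites + new_mutants + offspring
    best = min(pop, key=lambda k: total_migration_time(decode(k)))
    return decode(best), total_migration_time(decode(best))

print(rkga())
```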