976 results for Inside-Outside Algorithm


Relevance:

20.00%

Publisher:

Abstract:

Reliability of carrier-phase ambiguity resolution (AR) in an integer least-squares (ILS) problem depends on the ambiguity success rate (ASR), which in practice can be well approximated by the success probability of the integer bootstrapping solution. With the current GPS constellation, a sufficiently high ASR for the geometry-based model is achievable only for a certain percentage of the time, so high AR reliability cannot be assured by a single constellation. With a dual-constellation system (DCS), for example GPS and Beidou, which provides more satellites in view, users can expect significant performance benefits such as more reliable AR and high-precision positioning solutions. Simply using all satellites in view for AR and positioning is a straightforward solution, but it does not necessarily deliver the hoped-for reliability. This paper presents an alternative approach that, instead of using all visible satellites, selects a subset of them to achieve more reliable AR solutions in a multi-GNSS environment. Traditionally, satellite selection algorithms are mostly based on the position dilution of precision (PDOP) in order to meet accuracy requirements. In this contribution, reliability criteria are introduced for GNSS satellite selection, and a novel satellite selection algorithm for reliable ambiguity resolution (SARA) is developed. The SARA algorithm allows receivers to select a subset of satellites that achieves a high ASR, such as above 0.99. Numerical results from simulated dual-constellation cases show that with the SARA procedure the percentage of ASR values exceeding 0.99 and the percentage of ratio-test values passing the threshold of 3 are both higher than when all satellites in view are used directly; in the dual-constellation case the percentages of ASRs (>0.99) and ratio-test values (>3) reach 98.0 and 98.5 % respectively, compared with 18.1 and 25.0 % without satellite selection. It is also worth noting that SARA is simple to implement and computationally cheap, so it can be applied in most real-time data-processing applications.
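The abstract does not spell out SARA's selection criteria, but the bootstrapping success-rate formula it relies on is standard. The Python sketch below, under the assumption that a model callback (here named cov_for_subset) supplies the ambiguity covariance matrix of any candidate subset, and with no decorrelation step, shows a generic greedy subset selection against an ASR target of 0.99; it illustrates the idea, not the paper's SARA procedure.

```python
import numpy as np
from math import erf, sqrt

def bootstrap_success_rate(Q):
    """Bootstrapped ASR: P_s = prod_i (2*Phi(1/(2*sigma_i)) - 1), where the
    conditional standard deviations sigma_i come from the LDL^T factorisation
    of the ambiguity covariance matrix Q (no decorrelation step here)."""
    G = np.linalg.cholesky(Q)          # Q = G G^T, G lower triangular
    sigmas = np.diag(G)                # conditional standard deviations
    p = 1.0
    for s in sigmas:                   # 2*Phi(x) - 1 == erf(x / sqrt(2))
        p *= erf(1.0 / (2.0 * s * sqrt(2.0)))
    return p

def select_subset(sat_ids, cov_for_subset, target_asr=0.99, min_sats=5):
    """Greedy backward elimination: repeatedly drop the satellite whose removal
    most improves the bootstrapped ASR, until the target is met or a minimum
    number of satellites remains.  cov_for_subset(ids) is a hypothetical
    callback returning the ambiguity covariance matrix of that subset."""
    subset = list(sat_ids)
    best = bootstrap_success_rate(cov_for_subset(subset))
    while best < target_asr and len(subset) > min_sats:
        trials = [(bootstrap_success_rate(cov_for_subset([s for s in subset if s != d])), d)
                  for d in subset]
        asr, drop = max(trials)
        if asr <= best:
            break                      # no single removal improves the ASR
        subset.remove(drop)
        best = asr
    return subset, best

# Toy usage with a fixed synthetic covariance; submatrices stand in for subsets.
rng = np.random.default_rng(1)
A = rng.normal(size=(10, 10))
Q_full = A @ A.T + 0.05 * np.eye(10)
toy_cov = lambda ids: Q_full[np.ix_(ids, ids)]
print(select_subset(list(range(10)), toy_cov))
```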

Relevance:

20.00%

Publisher:

Abstract:

We revisit the venerable question of access-credentials management, which concerns the techniques that we, humans with limited memory, must employ to safeguard our various access keys and tokens in a connected world. Although many existing solutions can be employed to protect a long secret using a short password, those solutions typically require certain assumptions on the distribution of the secret and/or the password, and are helpful against only a subset of the possible attackers. After briefly reviewing a variety of approaches, we propose a user-centric comprehensive model that captures the possible threats posed by online and offline attackers, from the outside and the inside, against the security of both the plaintext and the password. We then propose a few very simple protocols, adapted in particular from the Ford-Kaliski server-assisted password generator and the Boldyreva unique blind signature, that provide the best protection against all kinds of threats, for all distributions of secrets. We also quantify the concrete security of our approach in terms of online and offline password guesses made by outsiders and insiders, in the random-oracle model. The main contribution of this paper lies not in the technical novelty of the proposed solution, but in the identification of the problem and its model. Our results have an immediate and practical application for the real world: they show how to implement single-sign-on stateless roaming authentication for the internet, in an ad hoc, user-driven fashion that requires no change to protocols or infrastructure.
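As background on the blinded-evaluation idea behind Ford-Kaliski-style server-assisted password hardening, here is a minimal Python sketch in a toy safe-prime group. The 11-bit parameters and the hash-to-group mapping are illustrative only, and this is not the paper's exact protocol; real deployments use standardised large groups or elliptic curves.

```python
import hashlib
import secrets

# Toy safe-prime group: p = 2q + 1 with q prime, so the quadratic residues mod p
# form a subgroup of prime order q.  ILLUSTRATIVE parameters only.
p, q = 2039, 1019

def hash_to_group(msg: bytes) -> int:
    """Map a message into the order-q subgroup by hashing and squaring mod p."""
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % p
    return pow(max(h, 2), 2, p)

# --- client: blind the hashed password ------------------------------------
password = b"correct horse battery staple"
g_pw = hash_to_group(password)
r = secrets.randbelow(q - 1) + 1          # blinding exponent in [1, q-1]
blinded = pow(g_pw, r, p)                 # value sent to the server

# --- server: apply its long-term secret exponent k -------------------------
k = secrets.randbelow(q - 1) + 1          # server's secret key
evaluated = pow(blinded, k, p)            # value returned to the client

# --- client: unblind and derive the strong secret ---------------------------
r_inv = pow(r, -1, q)                     # inverse of r modulo the group order
hardened = pow(evaluated, r_inv, p)       # equals g_pw^k; server never saw g_pw
secret_key = hashlib.sha256(str(hardened).encode()).hexdigest()

assert hardened == pow(g_pw, k, p)        # sanity check: the blinding cancels out
```

The point of the blinding is that the server contributes its key k without learning the password, and an attacker must interact with the server (or break its key) for every offline guess.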

Relevance:

20.00%

Publisher:

Abstract:

We propose a new protocol that provides cryptographically secure authentication for unaided humans against passive adversaries. We also propose a new generic passive attack on human identification protocols. The attack is an application of Coppersmith’s baby-step giant-step algorithm to human identification protocols. Under this attack, the achievable security of some of the best candidate human identification protocols in the literature is further reduced. We show that our protocol preserves similar usability while achieving better security than these protocols. A comprehensive security analysis is provided, which suggests parameters guaranteeing desired levels of security.
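The attack itself is tailored to human identification protocols, but it rests on the familiar square-root time-memory trade-off of baby-step giant-step search. For reference, a minimal Python implementation of the classic baby-step giant-step discrete-logarithm algorithm (not the paper's protocol-specific attack):

```python
from math import isqrt

def bsgs(g, h, p):
    """Solve g^x = h (mod p) by baby-step giant-step: O(sqrt(p)) time and
    memory instead of O(p) brute force.  Returns x, or None if no solution."""
    m = isqrt(p) + 1
    table = {pow(g, j, p): j for j in range(m)}   # baby steps: g^j
    factor = pow(g, -m, p)                        # g^{-m} mod p
    gamma = h % p
    for i in range(m):                            # giant steps: h * g^{-m*i}
        if gamma in table:
            return i * m + table[gamma]
        gamma = (gamma * factor) % p
    return None

# Example: recover the exponent 57 in the multiplicative group mod 1019.
assert bsgs(3, pow(3, 57, 1019), 1019) == 57
```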

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a computational method for eliminating the severe stress concentration at unsupported railhead ends in rail joints through innovative shape optimization of the contact zone, which is complex owing to near-field nonlinear contact. With a view to minimizing the computational effort, a hybrid genetic algorithm coupled with a parametric finite element model has been developed and compared with the traditional genetic algorithm (GA). The shape of the railhead top surface, where the wheel makes nonlinear contact, was optimized using the hybridized GA. A comparative study of the optimal result and the search efficiency of the traditional and hybrid GA methods shows that the hybridized GA provides the optimal shape in fewer computational cycles without losing accuracy. The method will be beneficial for solving complex engineering problems involving contact nonlinearity.
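The railhead geometry, contact model and finite element solver are not described in the abstract. As a generic illustration of the hybrid (memetic) GA idea, the Python sketch below couples a standard GA loop with a short local hill-climbing refinement of each offspring; the placeholder fitness function stands in for the parametric FE stress evaluation.

```python
import random

def hybrid_ga(fitness, n_params, bounds=(-1.0, 1.0), pop_size=30,
              generations=50, mutation=0.1, local_steps=5):
    """Generic memetic (hybrid) GA: truncation selection, one-point crossover,
    Gaussian mutation, plus a short hill-climb on each offspring.  `fitness`
    is minimised; in the paper's setting it would be the peak contact stress
    returned by a parametric FE run."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(n_params)] for _ in range(pop_size)]

    def local_search(x):
        """Cheap hill climbing around x -- the 'hybrid' ingredient."""
        best, fbest = list(x), fitness(x)
        for _ in range(local_steps):
            cand = [min(hi, max(lo, v + random.gauss(0.0, 0.05))) for v in best]
            fcand = fitness(cand)
            if fcand < fbest:
                best, fbest = cand, fcand
        return best

    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                      # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_params) if n_params > 1 else 0
            child = a[:cut] + b[cut:]                       # one-point crossover
            child = [min(hi, max(lo, v + random.gauss(0.0, 0.1)))
                     if random.random() < mutation else v for v in child]
            children.append(local_search(child))            # hybrid refinement
        pop = parents + children
    return min(pop, key=fitness)

# Placeholder fitness standing in for the FE stress evaluation of a shape vector.
best_shape = hybrid_ga(lambda x: sum(v * v for v in x), n_params=4)
```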

Relevance:

20.00%

Publisher:

Abstract:

The placement of mappers and reducers on machines directly affects the performance and cost of a MapReduce computation in cloud computing. From a computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. In this paper we therefore propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it by comparing it with several other heuristics, on solution quality and computation time, over a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics. We also verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement. The comparison shows that the computation using our mapper/reducer placement is much cheaper while still satisfying the computation deadline.
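The abstract does not describe the proposed heuristic itself. As background on the bin-packing problem it generalizes, here is the classic first-fit-decreasing heuristic in Python, with task sizes standing in for mapper/reducer resource demands and bins for machines.

```python
def first_fit_decreasing(task_sizes, machine_capacity):
    """Classic first-fit-decreasing heuristic for bin packing: sort tasks
    largest-first and place each into the first machine with enough room,
    opening a new machine if none fits.  Assumes no task exceeds the capacity."""
    bins = []                                     # each bin: [remaining, [tasks]]
    for size in sorted(task_sizes, reverse=True):
        for b in bins:
            if b[0] >= size:
                b[0] -= size
                b[1].append(size)
                break
        else:
            bins.append([machine_capacity - size, [size]])
    return [tasks for _, tasks in bins]

# Example: pack mapper/reducer demands onto machines of capacity 10.
print(first_fit_decreasing([7, 5, 4, 3, 2, 2, 1], machine_capacity=10))
# -> [[7, 3], [5, 4, 1], [2, 2]]
```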

Relevance:

20.00%

Publisher:

Abstract:

MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation. From a computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. In this paper we therefore propose a new grouping genetic algorithm for the mappers/reducers placement problem in cloud computing. Compared with the original grouping genetic algorithm, ours uses an innovative coding scheme and eliminates the inversion operator, which is an essential operator in the original algorithm. The new grouping genetic algorithm is evaluated by experiments, and the results show that it is much more efficient than four popular algorithms for the problem, including the original grouping genetic algorithm.
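The paper's coding scheme is not given in the abstract. For orientation only, the sketch below shows the standard Falkenauer-style group encoding and group-injection crossover that grouping genetic algorithms are usually built on; the capacity check is a stand-in for whatever placement constraints the real problem imposes.

```python
import random

def group_crossover(parent_a, parent_b, item_size, capacity):
    """Falkenauer-style grouping crossover: inject a slice of groups (machines)
    from parent B into parent A, drop A-groups broken by the injection, then
    reinsert the displaced items with first fit.  A chromosome is a list of
    groups; each group is a list of item (mapper/reducer) indices."""
    a = [list(g) for g in parent_a]
    lo = random.randrange(len(parent_b))
    hi = random.randrange(lo, len(parent_b)) + 1
    injected = [list(g) for g in parent_b[lo:hi]]
    moved = {i for g in injected for i in g}

    kept, orphans = [], []
    for g in a:
        if moved & set(g):                 # group broken by the injection
            orphans.extend(i for i in g if i not in moved)
        else:
            kept.append(g)
    child = kept + injected

    for item in orphans:                   # first-fit reinsertion of orphans
        for g in child:
            if sum(item_size[i] for i in g) + item_size[item] <= capacity:
                g.append(item)
                break
        else:
            child.append([item])
    return child

# Toy example: 6 tasks with these demands packed onto machines of capacity 10.
sizes = [6, 5, 4, 4, 3, 2]
pa = [[0, 3], [1, 4], [2, 5]]
pb = [[0, 2], [1, 5], [3, 4]]
print(group_crossover(pa, pb, sizes, capacity=10))
```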

Relevance:

20.00%

Publisher:

Abstract:

A Software-as-a-Service (SaaS) can be delivered in a composite form, consisting of a set of application and data components that work together to deliver higher-level functional software. Components in a composite SaaS may need to be scaled, that is, replicated or deleted, to accommodate the user's load. It may not be necessary to replicate all components of the SaaS, as some components can be shared by other instances. On the other hand, when the load is low, some of the instances may need to be deleted to avoid resource underutilisation. Thus, it is important to determine which components should be scaled so that the performance of the SaaS is maintained. Extensive research on SaaS resource management in the Cloud has not yet addressed the challenges of the scaling process for composite SaaS. We therefore propose a hybrid genetic algorithm that exploits problem-specific knowledge and explores the best combination of scaling actions for the components. Experimental results demonstrate that the proposed algorithm outperforms existing heuristic-based solutions.

Relevance:

20.00%

Publisher:

Abstract:

Chinese immigrant entrepreneurs, known the world over for their successful business practices (Kee, 1994), tend to start businesses within their ethnic enclave (EE). But in a move away from multiculturalism, host countries increasingly fear that immigration and asylum pose a threat to social integration, resulting in a lack of social cohesion and a plethora of government programs (Cheong, Edwards, Goulbourne & Solomos, 2007). For many immigrant entrepreneurs, the EE is an integral part of their social and cultural context and the location where ethnic resources reside (Logan, Alba & Stults, 2003). Immigrant entrepreneurs can harness networks for labor and customers through various ties in their EE (Portes & Zhou, 1996). Yang, Ho and Chang (2010) illustrate that Chinese immigrant entrepreneurs (IEs) were able to utilize ethnic network resources as social capital to reduce transaction costs and thus enhance business performance. Tilly (1990) explains that immigrants' reliance on such networks for business or other information minimizes the socioeconomic hardships they would otherwise experience in host countries (Raijman & Tienda, 2000). Acquiring jobs in ethnic businesses and establishing businesses within an EE may facilitate migrants' social integration into the host country (Tian & Shan, 1999). Although an EE has distinct economic advantages for immigrant entrepreneurs, Sequeira and Rasheed (2006: 367) argue that 'Exclusive reliance on strong ties within the immigrant enclave has a negative effect on growth outside the enclave community.' Similarly, Drori, Honig and Ginsberg (2010: 20) propose that 'The greater the reliance of transnational entrepreneurs on ethnic (versus societal) embedded resources and network structure, the narrower their possibilities of expanding the scope of their business.' This research asks: 'What is the role of the ethnic enclave in facilitating immigrant business growth and social integration?' The project has the following aims:
A1 To better understand the role of IEs, in particular Chinese IEs, in the Australian economy
A2 To investigate the role of the EE in facilitating or inhibiting immigrant business performance
A3 To understand how locating a firm inside or outside the EE affects the IE's embeddedness in co-ethnic and non-co-ethnic networks, and social integration
A4 To understand how an IE's social network affects business performance and social integration

Relevance:

20.00%

Publisher:

Abstract:

Immigrant entrepreneurs tend to start businesses within their ethnic enclave (EE), as it is an integral part of their social and cultural context and the location where ethnic resources reside (Logan, Alba, & Stults, 2003). Ethnic enclaves can be seen as a form of geographic cluster; Chinatowns are exemplar EEs, easily identified by the clustering of Chinese restaurants and other ethnic businesses in one central location. Studies of EEs have so far neglected their life-cycle stages and the impact of those stages on the business experiences of the entrepreneurs. In this paper, we track the formation, growth and decline of an EE. We argue that an EE is a special kind of industrial cluster and as such follows the growth conditions proposed by cluster life-cycle theory (Menzel & Fornahl, 2009). We report a mixed-method study of Chinese restaurants in South East Queensland. Based on multiple sources of data, we concluded that changes in government policy leading to a sharp increase in immigrant numbers from a distinctive cultural group can initiate and grow an EE. A continuous inflow of new immigrants and increasing competition within the cluster mark the mature stage of the EE, making growth conditions more favourable 'inside' the cluster. A decline in new immigrants from the same ethnic group, together with increased competition within the EE, may eventually lead to the decline of the industrial cluster, providing more favourable conditions for business growth outside it.

Relevance:

20.00%

Publisher:

Abstract:

The Fluid–Structure Interaction (FSI) problem is significant in science and engineering and poses challenges for computational mechanics. The coupled Finite Element and Smoothed Particle Hydrodynamics (FE-SPH) model is a robust technique for the simulation of FSI problems. However, two important steps of the coupled FE-SPH model, neighbor searching and contact searching, are extremely time-consuming. The Point-In-Box (PIB) searching algorithm was developed by Swegle to improve searching efficiency; however, it has the shortcoming that its efficiency can be significantly affected by the distribution of points (nodes in FEM and particles in SPH). In this paper, to improve searching efficiency, a novel Striped-PIB (S-PIB) searching algorithm is proposed that overcomes the point-distribution shortcoming of the PIB algorithm, and the two time-consuming steps of neighbor searching and contact searching are integrated into a single searching step. The accuracy and efficiency of the newly developed searching algorithm are studied through efficiency tests and FSI problems. It is found that the newly developed model can significantly improve computational efficiency, and it is believed to be a powerful tool for FSI analysis.
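The stripe construction of S-PIB is not detailed in the abstract. The Python sketch below only illustrates the kind of box query such searches accelerate: points are sorted once along one axis, and each neighbor/contact query bisects that axis before filtering the remaining coordinates.

```python
import bisect
import numpy as np

class AxisSortedSearch:
    """Box queries over a point cloud: sort once along x, then answer each
    query by bisecting the x-interval and filtering the remaining axes.
    A rough stand-in for Point-In-Box style neighbor/contact searching; the
    stripe decomposition of the S-PIB variant is not reproduced here."""
    def __init__(self, points):
        self.points = np.asarray(points, dtype=float)
        self.order = np.argsort(self.points[:, 0])
        self.xs = self.points[self.order, 0].tolist()

    def in_box(self, lower, upper):
        """Indices of all points with lower <= p <= upper (component-wise)."""
        i0 = bisect.bisect_left(self.xs, lower[0])
        i1 = bisect.bisect_right(self.xs, upper[0])
        cand = self.order[i0:i1]                    # x-range candidates only
        p = self.points[cand]
        mask = np.all((p >= lower) & (p <= upper), axis=1)
        return cand[mask]

# Example: neighbors of a particle within a smoothing-length box.
pts = np.random.rand(10000, 3)
searcher = AxisSortedSearch(pts)
centre, h = pts[0], 0.05
print(searcher.in_box(centre - h, centre + h))
```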

Relevance:

20.00%

Publisher:

Abstract:

Introduction: reading the signs.
Inside the dance ethos, knowledge is rarely articulated other than through the experience of dance itself. On the surface, the dancer focuses on practical and specialist skills. However, a closer look reveals that their knowledge does not merely trigger an embodied way of thinking; it enables the dancer to map a trail of metaphors within the body. In effect, dancers acquire a distinct embodied culture with its own language, dialects, customs and traditions. In this paper, I shall firstly examine the way metaphors establish a link between reason and imagination, between one set of embodied knowledge and another. It is this function, whereby metaphor welds opposites together and interior and exterior information exist in the same moment, that makes it most useful for jumping the fence from dance to cross-disciplinary practice. Secondly, I shall discuss how metaphors can help sustain creative practice, for it is only by stepping outside the culture of dance that I could first unravel the experiences, processes and knowledges inscribed through a career in dance and begin to define the quality of my own voice.

Relevance:

20.00%

Publisher:

Abstract:

Background: Numerous studies demonstrate the generation and short-term survival of adipose tissue; however, long-term persistence remains elusive. This study evaluates long-term survival and transferability of de novo adipose constructs based on a ligated vascular pedicle and tissue engineering chamber combination.
Methods: Defined adipose tissue flaps were implanted into rats in either intact or perforated domed chambers. In half of the groups, the chambers were removed after 10 weeks and the constructs transferred on their vascular pedicle to a new site, where they were observed for a further 10 weeks. In the remaining groups, the tissue construct was observed for 20 weeks inside the chamber. Tissue volume was assessed using magnetic resonance imaging and histologic measures, and constructs were assessed for stability and necrosis. Sections were assessed histologically and for proliferation using Ki-67.
Results: At 20 weeks, volume analysis revealed an increase in adipose volume from 0.04 ± 0.001 ml at the time of insertion into the chambers to 0.27 ± 0.004 ml in the closed and 0.44 ± 0.014 ml in the perforated chambers. There was an additional increase of approximately 10 to 15 percent in tissue volume in flaps that remained in chambers for 20 weeks, whereas the volume of the transferred tissue not in chambers remained unaltered. Histomorphometric assessment of the tissues documented no signs of hypertrophy, fat necrosis, or atypical changes of the newly generated tissue.
Conclusion: This study presents a promising new method of generating significant amounts of mature, vascularized, stable, and transferable adipose tissue for permanent autologous soft-tissue replacement.

Relevance:

20.00%

Publisher:

Abstract:

Background: Heatwaves can cause excess deaths in a local population ranging from tens to thousands within a couple of weeks. The excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the mortality expected under 'normal' conditions from the historical daily mortality records. Calculating excess mortality is a scientific challenge because of the stochastic temporal pattern of daily mortality data, which is characterised by (a) long-term changes in the mean level (i.e., non-stationarity) and (b) the non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed for analysing non-linear and non-stationary time series in the field of signal processing, but it has not previously been applied in public health research. This paper aims to demonstrate the applicability and strength of the HHT algorithm in analysing health data.
Methods: Special R functions were developed to implement the HHT algorithm, decomposing the daily mortality time series into trend and non-trend components in terms of the underlying physical mechanism. The excess mortality is then calculated directly from the resulting non-trend component series.
Results: Daily mortality time series from Brisbane (Queensland, Australia) and Chicago (United States) were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. To calculate the excess mortality associated with the July 1995 Chicago heatwave, the HHT algorithm needed to handle the mode-mixing issue; it estimated 510 excess deaths for that event. To exemplify potential applications, the HHT decomposition results were used as input data for a subsequent regression analysis of the Brisbane data, investigating the association between excess mortality and different risk factors.
Conclusions: The HHT algorithm is a novel and powerful analytical tool for time series analysis. It has real potential for a wide range of applications in public health research because of its ability to decompose a non-linear and non-stationary time series into trend and non-trend components consistently and efficiently.
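The HHT/EMD decomposition itself is implemented by the authors as custom R functions and is not reproduced here. The small numpy sketch below, using made-up numbers, only illustrates the final step described above: once a trend component has been extracted, the excess mortality of an event window is the sum of the non-trend component over that window.

```python
import numpy as np

# Synthetic daily mortality series: a seasonal baseline standing in for the
# trend component an EMD/HHT decomposition would return, plus Poisson noise
# and a short artificial "heatwave" spike.  All numbers are made up.
rng = np.random.default_rng(0)
days = np.arange(365)
trend = 40 + 5 * np.cos(2 * np.pi * days / 365)        # slowly varying baseline
observed = rng.poisson(trend).astype(float)
observed[30:37] += [5, 12, 20, 18, 10, 6, 3]            # synthetic event spike

non_trend = observed - trend                            # non-trend component
event = slice(30, 37)                                   # heatwave window
excess_deaths = non_trend[event].sum()
print(f"estimated excess deaths during the event: {excess_deaths:.0f}")
```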

Relevance:

20.00%

Publisher:

Abstract:

A novel method for the spontaneous generation of new adipose tissue from an existing fat flap is described. A defined volume of fat flap based on the superficial inferior epigastric vascular pedicle in the rat was elevated and inset into a hollow plastic chamber implanted subcutaneously in the groin of the rat. The chamber walls were either perforated or solid, and the chambers either contained a poly(D,L-lactic-co-glycolic acid) (PLGA) sponge matrix or not. The contents were analyzed after being in situ for 6 weeks. The total volume of the flap tissue increased significantly in all groups except the control groups, where the flap was not inserted into chambers, especially in the perforated chambers (0.08 ± 0.007 mL baseline compared to 1.2 ± 0.08 mL in the intact ones). Volume analysis of the individual component tissues within the flaps revealed that adipocyte volume increased and was at a maximum in the chambers without PLGA, where it expanded from 0.04 ± 0.003 mL at insertion to 0.5 ± 0.08 mL (a 1250% increase) in the perforated chambers and to 0.16 ± 0.03 mL (a 400% increase) in the intact chambers. Addition of PLGA scaffolds resulted in less fat growth. Histomorphometric analysis documented an increased number of adipocytes rather than simple hypertrophy. The new tissue was highly vascularized, and no fat necrosis or atypical changes were observed.