162 results for Biology computing


Relevance:

20.00%

Publisher:

Abstract:

It is exciting to be living at a time when the big questions in biology can be investigated using modern genetics and computing [1]. Bauzà-Ribot et al. [2] take on one of the fundamental drivers of biodiversity, the effect of continental drift in the formation of the world’s biota [3, 4], employing next-generation sequencing of whole mitochondrial genomes and modern Bayesian relaxed molecular clock analysis. Bauzà-Ribot et al. [2] conclude that vicariance via plate tectonics best explains the genetic divergence between subterranean metacrangonyctid amphipods currently found on islands separated by the Atlantic Ocean. This finding is a big deal in biogeography, and science generally [3], as many other presumed biotic tectonic divergences have been explained as probably due to more recent transoceanic dispersal events [4]. However, molecular clocks can be problematic [5, 6], and we have identified three issues with the analyses of Bauzà-Ribot et al. [2] that cast serious doubt on their results and conclusions. When we reanalyzed their mitochondrial data and attempted to account for problems with calibration [5, 6], modeling rates across branches [5, 7] and substitution saturation [5], we inferred a much younger date for their key node. This implies either a later trans-Atlantic dispersal of these crustaceans, or more likely a series of later invasions of freshwaters from a common marine ancestor, but either way probably not ancient tectonic plate movements.


Migraine is a common genetically linked neurovascular disorder. Approximately 12% of the Caucasian population is affected, including 18% of adult women and 6% of adult men (1, 2). A notable female bias is observed in migraine prevalence studies, with females affected roughly three times more often than males; this bias is attributed to differences in hormone levels arising from reproductive milestones. Migraine is extremely debilitating, with a wide-ranging socioeconomic impact that significantly affects people's health and quality of life. A number of neurotransmitter systems have been implicated in migraine, the most studied being the serotonergic and dopaminergic systems. Extensive genetic research has been carried out to identify genetic variants that may alter the activity of genes involved in the synthesis and transport of neurotransmitters in these systems. The biology of the glutamatergic system in migraine is the least studied; however, there is mounting evidence that its constituents could contribute to migraine. The discovery of antagonists that selectively block glutamate receptors has enabled studies of the physiologic role of glutamate and opened new perspectives on the potential therapeutic applications of glutamate receptor antagonists in diverse neurologic diseases. In this brief review, we discuss the biology of the glutamatergic system in migraine, outlining recent biochemical and genetic findings that support a role for altered glutamatergic neurotransmission in the manifestation of migraine, and the implications for migraine treatment.


Cloud computing is an emerging computing paradigm in which IT resources are provided over the Internet as a service to users. One such service offered through the Cloud is Software as a Service, or SaaS. SaaS can be delivered in a composite form, consisting of a set of application and data components that work together to deliver higher-level functional software. SaaS is receiving substantial attention today from both software providers and users, and analyst firms predict a positive future market for it. This raises new challenges for providers managing SaaS, especially in large-scale data centres such as the Cloud. One of these challenges is managing Cloud resources for SaaS in a way that maintains SaaS performance while optimising resource use. Extensive research on the resource optimisation of Cloud services has not yet addressed the challenges of managing resources for composite SaaS. This research addresses that gap by focusing on three new problems for composite SaaS: placement, clustering and scalability. The overall aim is to develop efficient and scalable mechanisms that facilitate the delivery of high-performance composite SaaS for users while optimising the resources used. All three problems are highly constrained, large-scale and complex combinatorial optimisation problems; therefore, evolutionary algorithms are adopted as the main technique for solving them. The first research problem concerns how a composite SaaS is placed onto Cloud servers to optimise its performance while satisfying the SaaS resource and response-time constraints. Existing research on this problem often ignores the dependencies between components and considers placement of a homogeneous type of component only. A precise formulation of the composite SaaS placement problem is presented.
A classical genetic algorithm and two versions of cooperative co-evolutionary algorithms are designed to manage the placement of heterogeneous types of SaaS components together with their dependencies, requirements and constraints. Experimental results demonstrate the efficiency and scalability of these new algorithms. In the second problem, SaaS components are assumed to be already running on Cloud virtual machines (VMs); however, because the Cloud environment changes, the current placement may need to be modified. Existing techniques have focused mostly on the infrastructure level rather than the application level. This research addresses the problem at the application level by clustering suitable components onto VMs to optimise resource use and maintain SaaS performance. Two versions of grouping genetic algorithms (GGAs) are designed to cater for the group structure of a composite SaaS. The first GGA uses a repair-based method, while the second uses a penalty-based method to handle the problem constraints. The experimental results confirm that the GGAs consistently produce a better reconfiguration placement plan than a common heuristic for clustering problems. The third research problem deals with the replication or deletion of SaaS instances to cope with the SaaS workload. Determining a scaling plan that minimises the resources used while maintaining SaaS performance is a critical task. Additionally, the problem involves constraints and interdependencies between components, making solutions even more difficult to find. A hybrid genetic algorithm (HGA) was developed to solve this problem, exploring the problem search space through its genetic operators and fitness function to determine the SaaS scaling plan. The HGA also uses the problem's domain knowledge to ensure that solutions meet the problem's constraints and achieve its objectives.
The experimental results demonstrated that the HGA consistently outperforms a heuristic algorithm by achieving a low-cost scaling and placement plan. This research has identified three significant new problems for composite SaaS in the Cloud. Several types of evolutionary algorithms have been developed to address these problems, contributing to the evolutionary computation field. The algorithms provide solutions for efficient resource management of composite SaaS in the Cloud, resulting in a low total cost of ownership for users while guaranteeing SaaS performance.
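The penalty-based placement idea above can be sketched as a toy genetic algorithm: components with resource demands are assigned to servers, and capacity violations add a penalty to the fitness so the search is pushed back toward feasible, low-cost placements. All demands, capacities and GA parameters below are invented for illustration; the actual algorithms in this research (cooperative co-evolution, GGAs, HGA) are considerably more elaborate.

```python
import random

# Hypothetical toy instance: assign 6 SaaS components to 3 servers.
# Demands and capacities are made-up numbers for illustration only.
DEMANDS = [2, 3, 1, 4, 2, 3]
CAPACITIES = [8, 6, 6]
PENALTY = 100  # cost added per unit of capacity violation

def fitness(assign):
    """Lower is better: number of servers used plus a penalty for overloads."""
    load = [0] * len(CAPACITIES)
    for comp, srv in enumerate(assign):
        load[srv] += DEMANDS[comp]
    overload = sum(max(0, l - c) for l, c in zip(load, CAPACITIES))
    servers_used = sum(1 for l in load if l > 0)
    return servers_used + PENALTY * overload

def evolve(pop_size=30, generations=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(len(CAPACITIES)) for _ in DEMANDS]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        next_pop = pop[:2]                           # elitism: keep the best two
        while len(next_pop) < pop_size:
            a, b = rng.sample(pop[:10], 2)           # parents from the fittest ten
            cut = rng.randrange(1, len(DEMANDS))     # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                   # mutation: reassign one component
                child[rng.randrange(len(DEMANDS))] = rng.randrange(len(CAPACITIES))
            next_pop.append(child)
        pop = next_pop
    best = min(pop, key=fitness)
    return best, fitness(best)

best, cost = evolve()
print("placement:", best, "cost:", cost)
```

A repair-based variant, like the first GGA described above, would instead move components off overloaded servers after crossover rather than penalising the violation in the fitness function.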


The purpose of this paper is to provide an evolutionary perspective of cloud computing (CC) by integrating two previously disparate literatures: CC and information technology outsourcing (ITO). We review the literature and develop a framework that highlights the demand for the CC service, its benefits and risks, and the risk-mitigation strategies that are likely to influence the success of the service. CC success in organisations, and as a technology overall, is a function of (i) the outsourcing decision and supplier selection, (ii) contractual and relational governance, and (iii) industry standards and the legal framework. Whereas CC clients have little control over standards and/or the legal framework, they are able to influence the other factors to maximise the benefits while limiting the risks. This paper provides guidelines for (potential) cloud computing users with respect to the outsourcing decision, vendor selection, service-level agreements, and other issues that need to be addressed when opting for CC services. We contribute to the literature by providing an evolutionary and holistic view of CC that draws on the extensive literature and theory of ITO. We conclude the paper with a number of research paths that future researchers can follow to advance knowledge in this field.


Background: Transfusion-related acute lung injury (TRALI) is a serious and potentially fatal consequence of transfusion. A two-event ovine model demonstrated that date-of-expiry supernatants (SN), from day (D) 5 platelet (PLT) and D42 packed red blood cell (PRBC) products, induced TRALI in LPS-treated sheep. We have adapted a whole blood transfusion culture model as an investigative bridge between the ovine TRALI model and human responses to transfusion. Methods: A whole blood transfusion model was adapted to replicate the ovine model: specifically, ±0.23 μg/mL LPS as the first event and a 10% SN volume (transfusion) as the second event. Four pooled SN from blood products previously used in the ovine TRALI model were investigated: D1-PLT, D5-PLT, D1-PRBC and D42-PRBC. Fresh human whole blood (recipient) was mixed with combinations of LPS and blood product SN (BP-SN) stimuli and incubated in vitro for 6 hours. Addition of a Golgi plug enabled measurement of monocyte cytokine production (IL-6, IL-8, IL-10, IL-12, TNF-α, IL-1α, CXCL-5, IP-10, MIP-1α, MCP-1) using multi-colour flow cytometry. Responses for six recipients were assessed. Results: In the presence of LPS, D42-PRBC-SN significantly increased monocyte IL-6 (P=0.031), IL-8 (P=0.016) and IL-1α (P=0.008) production compared with D1-PRBC-SN. This response to D42-PRBC-SN was LPS-dependent and was not evident in non-LPS-stimulated controls. It was also specific to D42-PRBC-SN, as similar changes were not evident for D5-PLT-SN compared with D1-PLT-SN, regardless of the presence of LPS. D5-PLT-SN did, however, significantly increase IL-12 production (P=0.024) compared with D1-PLT-SN; this response was again LPS-dependent. Conclusions: These data demonstrate a novel two-event mechanism of monocyte inflammatory response that was dependent upon both the presence of date-of-expiry blood product SN and LPS. Further, these results demonstrate different cytokine responses induced by date-of-expiry PLT-SN and PRBC-SN. These data are consistent with evidence from the ovine TRALI model, enhancing its relevance to transfusion-related changes in humans.


In this commentary the authors discuss the molecular basis of the training adaptation and review the role of several key signaling proteins important in the adaptation to endurance and resistance training.


This special issue of the Journal of Urban Technology brings together five articles that are based on presentations given at the Street Computing Workshop, held on 24 November 2009 in Melbourne in conjunction with the Australian Computer-Human Interaction Conference (OZCHI 2009). Our own article introduces the Street Computing vision and explores the potential, challenges, and foundations of this research trajectory. In order to do so, we first look at the currently available sources of information and discuss their link to existing research efforts. Section 2 then introduces the notion of Street Computing and our research approach in more detail. Section 3 looks beyond the core concept itself and summarizes related work in this field of interest. We conclude by introducing the papers contributed to this special issue.


Objectives: This study introduces and assesses the precision of a standardized protocol for anthropometric measurement of the juvenile cranium using three-dimensional surface-rendered models, for implementation in forensic investigation or paleodemographic research. Materials and methods: A subset of multi-slice computed tomography (MSCT) DICOM datasets (n=10) of modern Australian subadults (birth to 10 years) was accessed from the “Skeletal Biology and Forensic Anthropology Virtual Osteological Database” (n>1200), obtained from retrospective clinical scans taken at Brisbane children's hospitals (2009–2013). The capabilities of Geomagic Design X™ form the basis of this study, which introduces standardized protocols using triangle surface mesh models to (i) ascertain linear dimensions using reference plane networks and (ii) calculate the area of complex regions of interest on the cranium. Results: The protocols described in this paper demonstrate high levels of repeatability between five observers of varying anatomical expertise and software experience. Intra- and inter-observer errors were indiscernible, with total technical error of measurement (TEM) values ≤0.56 mm, constituting <0.33% relative error (rTEM) for linear measurements, and a TEM value of ≤12.89 mm², equating to <1.18% (rTEM) of the total area of the anterior fontanelle and contiguous sutures. Conclusions: Exploiting the advances of MSCT in routine clinical assessment, this paper assesses the application of this virtual approach for acquiring highly reproducible morphometric data in a non-invasive manner for human identification and for population studies of growth and development. The protocols and precision testing presented are imperative for the advancement of “virtual anthropology” into routine Australian medico-legal death investigation.
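The precision statistics reported above follow the standard anthropometric definitions: for two observers, TEM is the square root of the summed squared inter-observer differences divided by twice the number of subjects, and rTEM expresses TEM as a percentage of the grand mean. A minimal sketch, using invented measurement values rather than the study's data:

```python
import math

# Hypothetical repeat measurements (mm) of one cranial dimension by two
# observers; the values are illustrative, not from the study's dataset.
obs1 = [52.1, 48.7, 55.3, 50.2, 49.8]
obs2 = [52.4, 48.5, 55.0, 50.6, 49.9]

def tem(a, b):
    """Absolute technical error of measurement for two observers:
    sqrt(sum of squared differences / 2N)."""
    n = len(a)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / (2 * n))

def rtem(a, b):
    """Relative TEM: TEM as a percentage of the grand mean of all readings."""
    grand_mean = (sum(a) + sum(b)) / (2 * len(a))
    return 100 * tem(a, b) / grand_mean

print(f"TEM  = {tem(obs1, obs2):.3f} mm")
print(f"rTEM = {rtem(obs1, obs2):.3f} %")
```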


The topic of “the cloud” has attracted significant attention throughout the past few years (Cherry 2009; Sterling and Stark 2009) and, as a result, academics and trade journals have created several competing definitions of “cloud computing” (e.g., Motahari-Nezhad et al. 2009). Underpinning this article is the definition put forward by the US National Institute of Standards and Technology, which describes cloud computing as “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction” (Garfinkel 2011, p. 3). Despite the lack of consensus about definitions, however, there is broad agreement on the growing demand for cloud computing. Some estimates suggest that spending on cloud-related technologies and services in the next few years may climb as high as USD 42 billion/year (Buyya et al. 2009).


Currently, GNSS computing modes fall into two classes: network-based data processing and user receiver-based processing. A GNSS reference receiver station essentially contributes raw measurement data, either in the RINEX file format or as real-time data streams in the RTCM format; very little computation is carried out by the reference station itself. The existing network-based processing modes, regardless of whether they are executed in real time or post-processed, are centralised or sequential. This paper describes a distributed GNSS computing framework that incorporates three GNSS modes: reference station-based, user receiver-based and network-based data processing. Raw data streams from each GNSS reference receiver station are processed in a distributed manner, i.e., either at the station itself or at a hosting data server/processor, to generate station-based solutions, or reference receiver-specific parameters. These may include the precise receiver clock, zenith tropospheric delay, differential code biases, ambiguity parameters and ionospheric delays, as well as line-of-sight information such as azimuth and elevation angles. Covariance information for the estimated parameters may also optionally be provided. In such a mode, nearby precise point positioning (PPP) or real-time kinematic (RTK) users can directly use the corrections from all or some of the stations for real-time precise positioning via a data server. At the user receiver, PPP and RTK techniques are unified under the same observation models; the distinction lies in how the user receiver software deals with corrections from the reference station solutions and with ambiguity estimation in the observation equations. Numerical tests demonstrate good convergence behaviour for differential code bias and ambiguity estimates derived individually with single reference stations.
With station-based solutions from three reference stations within distances of 22–103 km the user receiver positioning results, with various schemes, show an accuracy improvement of the proposed station-augmented PPP and ambiguity-fixed PPP solutions with respect to the standard float PPP solutions without station augmentation and ambiguity resolutions. Overall, the proposed reference station-based GNSS computing mode can support PPP and RTK positioning services as a simpler alternative to the existing network-based RTK or regionally augmented PPP systems.
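As a rough sketch of the station-based mode described above, each reference station (or its hosting server) could publish a per-station solution object from which nearby PPP/RTK users assemble corrections. The field names, units, example values and the simple distance filter below are illustrative assumptions only, not an actual message standard such as RTCM:

```python
from dataclasses import dataclass, field

# Hypothetical container for one station-based solution, mirroring the
# reference receiver-specific parameters listed above. Field names and
# units are illustrative, not a standardised correction format.
@dataclass
class StationSolution:
    station_id: str
    receiver_clock_s: float                              # precise receiver clock
    zenith_tropo_delay_m: float                          # zenith tropospheric delay
    code_biases_ns: dict = field(default_factory=dict)   # per-signal DCBs
    ambiguities: dict = field(default_factory=dict)      # per-satellite ambiguities
    ionosphere_m: dict = field(default_factory=dict)     # per-satellite slant delays

def nearby_solutions(solutions, distances_km, max_km=103.0):
    """Select corrections from reference stations within range of the user."""
    return [s for s in solutions if distances_km[s.station_id] <= max_km]

# Toy data: three stations, with station C too far away to be used.
sols = [StationSolution("A", 1.2e-7, 2.31),
        StationSolution("B", -3.4e-8, 2.28),
        StationSolution("C", 8.9e-8, 2.40)]
dist = {"A": 22.0, "B": 75.0, "C": 140.0}

usable = nearby_solutions(sols, dist)
print([s.station_id for s in usable])  # ['A', 'B']
```

The 103 km default mirrors the largest station-user distance tested above; a real data server would also have to handle solution latency, covariance information and consistency between stations.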


The purpose of this paper is to empirically examine the state of cloud computing adoption in Australia. I specifically focus on the drivers, risks, and benefits of cloud computing from the perspective of IT experts and forensic accountants. I use thematic analysis of interview data to answer the research questions of the study. The findings suggest that cloud computing is increasingly gaining a foothold in many sectors owing to advantages such as flexibility and speed of deployment. However, security remains an issue, and adoption is therefore likely to be selective and phased. Of particular concern is the involvement of third parties and foreign jurisdictions, which in the event of damage may complicate litigation and forensic investigations. This is one of the first empirical studies to report on cloud computing adoption and experiences in Australia.


This research introduces a general methodology for creating a Coloured Petri Net (CPN) model of a security protocol. Standard or user-defined security properties of the created CPN model are then identified. After an attacker model is added to the protocol model, each security property is verified using the state-space method. This approach is applied to analyse a number of trusted computing protocols. The results show the applicability of the proposed method to the analysis of both standard and user-defined properties.
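A minimal sketch of the state-space idea: enumerate all states reachable from the initial state and check the security property in each, reporting any violating state as a counterexample. The toy two-field "protocol" state and attacker transition below are invented for illustration; a real analysis would generate the state space of a Coloured Petri Net model in a tool such as CPN Tools.

```python
from collections import deque

# Toy model: a state is (holder_of_secret, sent_encrypted). The attacker
# transition below is an invented illustration of an attacker model.
def successors(state):
    holder, sent_encrypted = state
    nxt = []
    if holder == "A" and not sent_encrypted:
        nxt.append(("B", True))          # A sends the secret encrypted to B
    if not sent_encrypted:
        nxt.append(("attacker", False))  # attacker intercepts the plaintext
    return nxt

def property_holds(state):
    """User-defined security property: the attacker never holds the secret."""
    return state[0] != "attacker"

def explore(initial):
    """Breadth-first state-space exploration; returns a violating state or None."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if not property_holds(state):
            return state                 # counterexample found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None                          # property holds in every reachable state

violation = explore(("A", False))
print(violation)  # the reachable plaintext-interception state is a counterexample
```

In the CPN setting, states correspond to markings and `successors` to enabled transition firings; the same reachability search then verifies standard or user-defined properties.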


The nonlinear problem of steady free-surface flow past a submerged source is considered as a case study for three-dimensional ship wave problems. Of particular interest is the distinctive wedge-shaped wave pattern that forms on the surface of the fluid. By reformulating the governing equations with a standard boundary-integral method, we derive a system of nonlinear algebraic equations that enforce a singular integro-differential equation at each midpoint on a two-dimensional mesh. Our contribution is to solve the system of equations with a Jacobian-free Newton-Krylov method together with a banded preconditioner that is carefully constructed with entries taken from the Jacobian of the linearised problem. Further, we are able to utilise graphics processing unit acceleration to significantly increase the grid refinement and decrease the run-time of our solutions in comparison to schemes that are presently employed in the literature. Our approach provides opportunities to explore the nonlinear features of three-dimensional ship wave patterns, such as the shape of steep waves close to their limiting configuration, in a manner that has been possible in the two-dimensional analogue for some time.
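The "Jacobian-free" ingredient of the Newton-Krylov scheme above is the approximation of a Jacobian-vector product by a finite difference of the residual, J(u)v ≈ (F(u + εv) − F(u))/ε, so that the Jacobian is never assembled analytically. The toy 2×2 system below illustrates only this product (recovering the Jacobian columns from products with unit vectors and solving directly); a genuine JFNK solver would instead feed the products to a Krylov method such as preconditioned GMRES, as the paper does at far larger scale.

```python
import math

# Toy nonlinear residual with a root at x = y = sqrt(2); invented example.
def F(u):
    x, y = u
    return [x * x + y * y - 4.0, x - y]

def jac_vec(F, u, v, eps=1e-7):
    """Matrix-free approximation of J(u) @ v via one extra residual evaluation."""
    Fu = F(u)
    Fp = F([ui + eps * vi for ui, vi in zip(u, v)])
    return [(a - b) / eps for a, b in zip(Fp, Fu)]

def newton(F, u, iters=20):
    for _ in range(iters):
        # Recover the 2x2 Jacobian from matrix-free products with unit vectors.
        c0 = jac_vec(F, u, [1.0, 0.0])
        c1 = jac_vec(F, u, [0.0, 1.0])
        a, b, c, d = c0[0], c1[0], c0[1], c1[1]
        r = F(u)
        det = a * d - b * c
        # Solve J @ delta = -r directly (Cramer's rule for the 2x2 system).
        dx = (-r[0] * d + r[1] * b) / det
        dy = (-r[1] * a + r[0] * c) / det
        u = [u[0] + dx, u[1] + dy]
    return u

root = newton(F, [1.0, 2.0])
print(root)  # approaches [sqrt(2), sqrt(2)]
```

For large discretised systems the column-by-column recovery above is exactly what JFNK avoids: GMRES needs only the products `jac_vec(F, u, v)`, and a preconditioner (here, the banded approximation from the linearised problem) keeps the Krylov iteration count low.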


Enterprises, both public and private, have rapidly begun using the benefits of enterprise resource planning (ERP) combined with business analytics and “open data sets”, which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a “cloud computing” environment. “Garbage in, garbage out”, or “GIGO”, is a term long used to describe problems of unqualified dependency on information systems, dating from the 1960s. However, a more pertinent variation arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems, such as ERP and the use of open datasets in a cloud environment, the ability to verify the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set “impersonation” becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhanced identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment and analyses some current technologies on offer that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
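One widely available building block for the authenticity services discussed above is a published cryptographic digest of a data set, which lets a consumer detect tampering or "impersonation" before depending on the data. The sketch below is a minimal illustration with invented data; it checks integrity only, since authenticating the publisher would additionally require signatures and the naming and identity services the paper calls for.

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest of a data set's bytes, published alongside the data."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical open data set and its published digest.
published = b"open,data,set\n1,2,3\n"
published_digest = digest(published)

# A faithful copy verifies; an "impersonated" variant does not.
received_ok = b"open,data,set\n1,2,3\n"
received_bad = b"open,data,set\n1,2,99\n"

print(digest(received_ok) == published_digest)   # True
print(digest(received_bad) == published_digest)  # False
```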


The concept of cloud computing services is appealing to small and medium enterprises (SMEs), offering the opportunity to acquire modern information technology resources as a utility and to avoid costly capital investments in technology. However, adopting cloud computing services presents significant challenges for SMEs, which need to determine an adoption path that ensures their sustainable presence in the cloud computing environment. Information about SME approaches to adopting cloud computing services is fragmented. Through an interpretive design, we suggest that to forge a successful adoption path SMEs need to have a strategic and incremental intent, understand their organizational structure and the external factors, consider their human resource capacity, and understand the value they expect from cloud computing services. These factors would contribute to a model of cloud services for SMEs.