939 results for CROWN
The ratio of VEGF/PEDF expression in bone marrow mesenchymal stem cells regulates neovascularization
Abstract:
Angiogenesis, or neovascularization, is a finely balanced process controlled by pro- and anti-angiogenic factors. Vascular endothelial growth factor (VEGF) is a major pro-angiogenic factor, whereas pigment epithelium-derived factor (PEDF) is the most potent natural angiogenesis inhibitor. In this study, the regulatory role of bone marrow stromal cells (BMSCs) during angiogenesis was assessed by their endothelial differentiation potential, VEGF/PEDF production and responses to pro-angiogenic and hypoxic conditions. The in vivo regulation of blood vessel formation by BMSCs was also explored in a SCID mouse model. Results showed that PEDF was expressed more prominently than VEGF in BMSCs, in contrast to human umbilical vein endothelial cells (HUVECs), where the expression of VEGF was higher than that of PEDF. The ratio of VEGF/PEDF gene expression in BMSCs increased when the VEGF concentration in the culture medium reached 40 ng/ml, but decreased at 80 ng/ml. Under CoCl2-induced hypoxic conditions, the VEGF/PEDF ratio of BMSCs increased significantly in both normal and angiogenic culture media. Unlike HUVECs, BMSCs expressed no endothelial cell markers when cultured under either pro-angiogenic or hypoxic conditions. The in vivo study showed that VEGF/PEDF expression closely correlated with the degree of neovascularization, and that hypoxia significantly induced pro-angiogenic activity in BMSCs. These results indicate that, rather than being progenitors of endothelial cells, BMSCs play an important role in regulating the neovascularization process, and that the ratio of VEGF to PEDF may, in effect, be an indicator of the pro- or anti-angiogenic activity of BMSCs.
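The VEGF/PEDF ratio described above is, in practice, a ratio of two relative expression levels. As an illustration only, the sketch below computes such a ratio from hypothetical qPCR cycle-threshold values using the standard 2^-dCt relative-quantification method; the paper does not state that this is how its expression data were obtained, and the gene names and Ct values here are assumptions.

```python
def relative_expression(ct_target: float, ct_reference: float) -> float:
    """Relative expression via the 2^-dCt method against a housekeeping gene."""
    return 2.0 ** -(ct_target - ct_reference)

# Hypothetical cycle-threshold values (not taken from the paper)
ct = {"VEGF": 26.4, "PEDF": 22.1, "GAPDH": 18.0}

vegf = relative_expression(ct["VEGF"], ct["GAPDH"])
pedf = relative_expression(ct["PEDF"], ct["GAPDH"])

# A ratio below 1 would suggest an anti-angiogenic bias, as reported for BMSCs
print(f"VEGF/PEDF expression ratio: {vegf / pedf:.4f}")
```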
Abstract:
If copyright law does not liberate us from restrictions on the dissemination of knowledge, and if it does not encourage expressive freedom, what is its purpose? This volume offers the thinking and suggestions of some of the finest minds grappling with the future of copyright regulation. The Copyright Future Copyright Freedom conference, held in 2009 at Old Parliament House, Canberra, brought together Lawrence Lessig, Julie Cohen, Leslie Zines, Adrian Sterling, Sam Ricketson, Graham Greenleaf, Anne Fitzgerald, Susy Frankel, John Gilchrist, Michael Kirby and others to share the rich fruits of their experience and analysis. Zines, Sterling and Gilchrist outline their roles in the genesis and early growth of Australian copyright legislation, enriching the knowledge of anyone asking urgent questions about the future of information regulation.
Abstract:
When an organisation becomes aware that one of its products may pose a safety risk to customers, it must take appropriate action as soon as possible or it can be held liable. The ability to automatically trace potentially dangerous goods through the supply chain would thus help organisations fulfil their legal obligations in a timely and effective manner. Furthermore, product recall legislation requires manufacturers to separately notify various government agencies, the health department and the public about recall incidents. This duplication of effort and paperwork can introduce errors and data inconsistencies. In this paper, we examine traceability and notification requirements in the product recall domain from two perspectives: the activities carried out during the manufacturing and recall processes, and the data collected during the enactment of these processes. We then propose a workflow-based coordination framework to support these data and process requirements.
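To illustrate the coordination idea in the abstract (one canonical recall record fanned out to all required recipients, rather than separate, error-prone notifications), here is a minimal sketch. The data fields, channel names and notification logic are hypothetical; the paper's actual framework is workflow-based and considerably richer.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class RecallNotice:
    """A single canonical record of a recall incident (hypothetical fields)."""
    product_id: str
    batch_numbers: list[str]
    hazard: str

def notify_all(notice: RecallNotice,
               channels: dict[str, Callable[[RecallNotice], None]]) -> None:
    # One authoritative notice fans out to every required recipient, so the
    # regulator, the health department and the public all receive identical data.
    for send in channels.values():
        send(notice)

notice = RecallNotice("SKU-1042", ["B231", "B232"], "undeclared allergen")
notify_all(notice, {
    "regulator":   lambda n: print("regulator notified of", n.product_id),
    "health_dept": lambda n: print("health department notified of", n.product_id),
    "public_site": lambda n: print("public notice posted for", n.product_id),
})
```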
Abstract:
There has been an increasing interest by governments worldwide in the potential benefits of open access to public sector information (PSI). However, an important question remains: can a government incur tortious liability for incorrect information released online under an open content licence? This paper argues that the release of PSI online for free under an open content licence, specifically a Creative Commons licence, is within the bounds of an acceptable level of risk to government, especially where users are informed of the limitations of the data and appropriate information management policies and principles are in place to ensure accountability for data quality and accuracy.
Abstract:
In the networked, information-driven world that we now inhabit, the ability to access and reuse information, data and culture is a key ingredient of social, economic and cultural innovation. As government holds enormous amounts of publicly funded material that can be released to the public without breaching the law, it should move to implement policies that will allow better access to and reuse of that information, knowledge and culture. The Queensland Government Information Licensing Framework (GILF) Project is one of the first projects in the world to systematically approach this issue and should be consulted as a best-practice model.
Abstract:
The use of appropriate features to characterise an output class or object is critical for all classification problems. In order to find optimal feature descriptors for vegetation species classification in a power line corridor monitoring application, this article evaluates the capability of several spectral and texture features. A new spectral–texture feature descriptor is proposed by incorporating spectral vegetation indices into statistical moment features, and the proposed method is evaluated against several classic texture feature descriptors. An object-based classification method is used, with a support vector machine employed as the benchmark classifier. Individual tree crowns are first detected and segmented from aerial images, and different feature vectors are extracted to represent each tree crown. The experimental results showed that the proposed spectral moment features outperform, or are at least comparable to, state-of-the-art texture descriptors in terms of classification accuracy. A comprehensive quantitative evaluation using receiver operating characteristic space analysis further demonstrates the strength of the proposed feature descriptors.
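A minimal sketch of the general idea behind such a spectral moment descriptor: compute a vegetation index per pixel over a segmented crown, summarise its distribution with moment statistics, and feed the resulting vector to an SVM. NDVI, the specific moments and the random stand-in data are assumptions here, not the paper's exact configuration.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC

def spectral_moment_features(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Moment statistics of NDVI over one segmented tree crown region."""
    ndvi = (nir - red) / (nir + red + 1e-9)   # per-pixel vegetation index
    v = ndvi.ravel()
    return np.array([v.mean(), v.std(), skew(v), kurtosis(v)])

# Random reflectance patches standing in for segmented crowns (illustration only)
rng = np.random.default_rng(0)
X = np.array([spectral_moment_features(rng.random((16, 16)), rng.random((16, 16)))
              for _ in range(40)])
y = rng.integers(0, 2, size=40)               # two hypothetical species labels

clf = SVC(kernel="rbf").fit(X[:30], y[:30])   # SVM as the benchmark classifier
print("held-out accuracy:", clf.score(X[30:], y[30:]))
```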
Abstract:
Ethernet is a key component of the standards used for digital process buses in transmission substations, namely IEC 61850 and IEEE Std 1588-2008 (PTPv2). These standards use multicast Ethernet frames that can be processed by more than one device. This presents some significant engineering challenges when implementing a sampled value process bus, due to the large amount of network traffic. A system of network traffic segregation using a combination of Virtual LAN (VLAN) and multicast address filtering with managed Ethernet switches is presented. This includes VLAN prioritisation of traffic classes such as the IEC 61850 protocols GOOSE, MMS and sampled values (SV), and other protocols like PTPv2. Multicast address filtering is used to limit SV/GOOSE traffic to defined subsets of subscribers. A method to map substation plant reference designations to multicast address ranges is proposed that enables engineers to determine the type of traffic and the location of the source by inspecting the destination address. This method and the proposed filtering strategy simplify future changes to the prioritisation of network traffic, and are applicable to both process bus and station bus applications.
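The sketch below illustrates one way such a designation-to-address mapping could work: the traffic class selects an IEC 61850 multicast base address, and the plant location is encoded in the low-order octets, so the destination MAC alone reveals both. The bay/device encoding is hypothetical (the paper defines its own mapping), and addresses generated this way can fall outside the narrower ranges recommended by the standard.

```python
def multicast_mac(traffic: str, bay: int, device: int) -> str:
    """Map a traffic class and plant reference (bay, device) to a multicast MAC."""
    # Base addresses from the IEC 61850 multicast blocks for GOOSE and SV
    base = {"GOOSE": 0x010CCD010000, "SV": 0x010CCD040000}[traffic]
    addr = base | ((bay & 0xFF) << 8) | (device & 0xFF)
    return "-".join(f"{(addr >> s) & 0xFF:02X}" for s in range(40, -1, -8))

print(multicast_mac("SV", bay=3, device=1))     # 01-0C-CD-04-03-01
print(multicast_mac("GOOSE", bay=3, device=1))  # 01-0C-CD-01-03-01
```

An engineer capturing traffic can then read the traffic class (fourth octet) and the source bay and device (last two octets) straight off the destination address, which is the property the proposed mapping is designed to provide.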
Abstract:
Photochemistry has made significant contributions to our understanding of many important natural processes as well as to scientific discoveries in the man-made world. The measurements from such studies are often complex and may require advanced data interpretation using multivariate or chemometrics methods. In general, such methods have been applied successfully for data display, classification, multivariate curve resolution and prediction in analytical chemistry, environmental chemistry, engineering, medical research and industry. In photochemistry, by comparison, applications of such multivariate approaches are less frequent, although a variety of methods have been used, especially in spectroscopic photochemical applications. The methods include Principal Component Analysis (PCA; data display), Partial Least Squares (PLS; prediction), Artificial Neural Networks (ANN; prediction) and several models for multivariate curve resolution related to Parallel Factor Analysis (PARAFAC; decomposition of complex responses). Applications of such methods are discussed in this overview; typical examples include photodegradation of herbicides, prediction of antibiotics in human fluids (fluorescence spectroscopy), non-destructive in-line and on-line monitoring (near-infrared spectroscopy) and fast time resolution of spectroscopic signals from photochemical reactions. It is also quite clear from the literature that the scope of spectroscopic photochemistry has been enhanced by the application of chemometrics. To highlight and encourage further applications of chemometrics in photochemistry, several additional chemometrics approaches are discussed using data collected by the authors. The use of a PCA biplot is illustrated with an analysis of a matrix containing data on the performance of photocatalysts developed for water splitting and hydrogen production. In addition, the applications of Multi-Criteria Decision Making (MCDM) ranking methods and Fuzzy Clustering are demonstrated with an analysis of a water quality data matrix. Other examples include the application of simultaneous kinetic spectroscopic methods for the prediction of pesticides, and the use of a response fingerprinting approach for the classification of medicinal preparations. In general, the overview endeavours to emphasise the advantages of chemometric interpretation of multivariate photochemical data; an appendix of references and summaries of the common and less usual chemometrics methods noted in this work is provided. Crown Copyright © 2010.
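As a pointer to what a PCA biplot involves computationally, the sketch below derives scores and loadings from a small synthetic matrix via the singular value decomposition; the photocatalyst performance data analysed in the paper are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((8, 4))                 # 8 samples x 4 performance variables (synthetic)
Xc = X - X.mean(axis=0)                # column-centre before PCA
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

scores = U[:, :2] * S[:2]              # sample coordinates on the first two PCs
loadings = Vt[:2].T                    # variable directions on the same axes
explained = S**2 / np.sum(S**2)
print("variance explained by PC1, PC2:", explained[:2].round(3))
# A biplot overlays `scores` (as points) and `loadings` (as arrows) in one
# figure, so samples and the variables that separate them can be read together.
```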
Abstract:
Empathy is an important pro-social behaviour critical to a positive client–therapist relationship. Therapist anxiety has been linked to reduced ability to empathise and lower client satisfaction with therapy. However, the nature of the relationship between anxiety and empathy is currently unclear. The current study investigated the effect of experimentally induced anxiety on empathic responses elicited during three different perspective-taking tasks. Perspective-taking was manipulated within subjects, with all participants (N = 52) completing imagine-self, imagine-other and objective conditions. A threat-of-shock manipulation was used to vary anxiety between subjects. Participants in the threat-of-shock condition reported higher levels of anxiety during the experiment and lower levels of empathy-related distress for the targets than participants in the control condition. Perspective-taking was associated with higher levels of empathy-related distress and concern compared to the objective condition. The present results suggest that perspective-taking can to a large extent mitigate the influence of heightened anxiety on an individual's ability to empathise.
Abstract:
Road dust contains potentially toxic pollutants originating from a range of anthropogenic sources common to urban land uses, together with soil inputs from surrounding areas. This research study analysed the mineralogy and morphology of dust samples collected from road surfaces in different land uses, along with background soil samples, to characterise the relative source contributions to road dust. The road dust consisted primarily of soil-derived minerals (60%), with quartz averaging 40-50% and the remainder being the clay-forming minerals albite, microcline, chlorite and muscovite originating from surrounding soils. About 2% was organic matter, primarily originating from plant matter. Potentially toxic pollutants represented about 30% of the build-up. These pollutants derive from brake and tyre wear, combustion emissions and fly ash from asphalt. Heavy metals such as Zn, Cu, Pb, Ni, Cr and Cd primarily originate from vehicular traffic, while Fe, Al and Mn primarily originate from surrounding soils. The research study confirmed the significant contribution of vehicular traffic to dust deposited on urban road surfaces.
Abstract:
The opening phrase of the title is from Charles Darwin's notebooks (Schweber 1977). It is a double reminder: firstly, that mainstream evolutionary theory is not just about describing nature but is particularly looking for mechanisms or 'causes'; and secondly, that there will usually be several causes affecting any particular outcome. The second part of the title reflects our concern at the almost universal rejection of the idea that biological mechanisms are sufficient for macroevolutionary changes, a rejection of a cornerstone of Darwinian evolutionary theory. Our primary aim here is to consider ways of making it easier to develop and to test hypotheses about evolution; formalizing hypotheses can help generate tests. In an absolute sense, some of the discussion by scientists about evolution is little better than the lack of reasoning used by those advocating intelligent design. Our discussion here is in a Popperian framework, where science is defined as that area of study in which it is possible, in principle, to find evidence against hypotheses – they are in principle falsifiable. With time, however, the boundaries of science keep expanding. In the past, some aspects of evolution were outside the boundaries of falsifiable science, but new techniques and ideas are increasingly expanding those boundaries, and it is appropriate to re-examine some topics. Over the last few decades there appears to have been an increasingly strong assumption to look first (and only) for a physical cause. This decision is virtually never formally discussed; it is simply assumed that some physical factor 'drives' evolution. It is necessary to examine our assumptions much more carefully: what is meant by physical factors 'driving' evolution, or by an 'explosive radiation'? Our discussion focuses on two of the six mass extinctions: the fifth, the events of the Late Cretaceous, and the sixth, which started at least 50,000 years ago and is ongoing. Cretaceous/Tertiary boundary: the rise of birds and mammals. We have had a long-term interest (Cooper and Penny 1997) in designing tests to help evaluate whether the processes of microevolution are sufficient to explain macroevolution. The real challenge is to formulate hypotheses in a testable way. For example, the number of lineages of birds and mammals that survive from the Cretaceous to the present is one such test. Our first estimate was 22 for birds, and current work is tending to increase this value. This still does not consider lineages that survived into the Tertiary and then went extinct later. Our initial suggestion was probably too narrow in that it lumped four models from Penny and Phillips (2004) into one model. This reduction is too simplistic in that we need to know about survival and about ecological and morphological divergences during the Late Cretaceous, and whether crown groups of avian or mammalian orders may have existed back into the Cretaceous. More recently (Penny and Phillips 2004) we have formalized hypotheses about dinosaurs and pterosaurs, with the prediction that interactions between mammals (and ground-feeding birds) and dinosaurs would be most likely to affect the smallest dinosaurs, and similarly that interactions between birds and pterosaurs would particularly affect the smaller pterosaurs. There is now evidence for both classes of interactions, with the smallest dinosaurs and pterosaurs declining first, as predicted. Thus, testable models are now possible. Mass extinction number six: human impacts.
On a broad scale, there is a good correlation between the time of human arrival and increased extinctions (Hurles et al. 2003; Martin 2005; Figure 1). However, it is necessary to distinguish different time scales (Penny 2005), and on a finer scale there are still large numbers of possibilities. In Hurles et al. (2003) we mentioned habitat modification (including the use of fire) and introduced plants and animals (including kiore), in addition to direct predation (the 'overkill' hypothesis). We also need to consider the prey switching that occurs in early human societies, as evidenced by the results of Wragg (1995) on the middens of different ages on Henderson Island in the Pitcairn group. In addition, the presence of human-wary or human-adapted animals will affect the distribution in the subfossil record. A better understanding of human impacts world-wide, in conjunction with pre-scientific knowledge, will make it easier to discuss the issues by removing 'blame'. While continued spontaneous generation was accepted universally, there was the expectation that animals continued to reappear. New Zealand is one of the very best locations in the world to study many of these issues: apart from the marine fossil record, some human impact events are extremely recent and the remains less disrupted by time.
Abstract:
Floods are the most common type of disaster globally, responsible for almost 53,000 deaths in the last decade alone (with a 23:1 ratio of deaths in low- versus high-income countries). This review assessed recent epidemiological evidence on the impacts of floods on human health. Published articles (2004–2011) on the quantitative relationship between floods and health were systematically reviewed, and 35 relevant epidemiological studies were identified. Health outcomes were categorized as short- or long-term and were found to depend on the flood characteristics and people's vulnerability. Long-term health effects are currently not well understood. Mortality rates were found to increase by up to 50% in the first year post-flood. After floods, there is an increased risk of disease outbreaks such as hepatitis E, gastrointestinal disease and leptospirosis, particularly in areas with poor hygiene and displaced populations. Psychological distress in survivors (prevalence 8.6% to 53% two years post-flood) can also exacerbate their physical illness. There is a need for effective policies to reduce and prevent flood-related morbidity and mortality. Such steps are contingent upon an improved understanding of the potential health impacts of floods. Global trends in urbanization, burden of disease, malnutrition and maternal and child health must be better reflected in flood preparedness and mitigation programs.
Abstract:
In this paper we consider the variable-order time-fractional diffusion equation. We adopt the Coimbra variable-order (VO) time-fractional operator, which defines a consistent method for VO differentiation of physical variables. The Coimbra variable-order fractional operator can also be viewed as a Caputo-type definition. Although this definition is arguably the most appropriate, having fundamental characteristics that are desirable for physical modeling, numerical methods for fractional partial differential equations using this definition have not yet appeared in the literature. Here, an approximate scheme is first proposed. The stability, convergence and solvability of this numerical scheme are discussed via the technique of Fourier analysis. Numerical examples are provided to show that the numerical method is computationally efficient. Crown Copyright © 2012 Published by Elsevier Inc. All rights reserved.
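To make the setting concrete, here is a minimal implicit sketch for a variable-order time-fractional diffusion equation D_t^{alpha(t)} u = kappa u_xx, using an L1-type discretisation of a Caputo-type operator with a time-dependent order. This is one standard construction, not the scheme proposed in the paper; the order function alpha(t) and all parameter values are hypothetical.

```python
import numpy as np
from math import gamma

kappa, L, T = 1.0, 1.0, 0.5
nx, nt = 41, 100
h, tau = L / (nx - 1), T / nt
x = np.linspace(0.0, L, nx)

alpha = lambda t: 0.8 + 0.1 * np.sin(2 * np.pi * t)   # hypothetical order in (0, 1)

U = np.zeros((nt + 1, nx))
U[0] = np.sin(np.pi * x)                              # initial condition

for n in range(1, nt + 1):
    a = float(alpha(n * tau))
    r = kappa * tau**a * gamma(2.0 - a) / h**2
    k = np.arange(1, n)
    b = (k + 1.0)**(1.0 - a) - k**(1.0 - a)           # L1 history weights
    hist = (b[:, None] * (U[n - k] - U[n - k - 1])).sum(axis=0)

    # implicit tridiagonal system: (I - r * discrete Laplacian) u^n = u^{n-1} - history
    A = (1.0 + 2.0 * r) * np.eye(nx)
    A += np.diag(np.full(nx - 1, -r), 1) + np.diag(np.full(nx - 1, -r), -1)
    A[0, :], A[-1, :] = 0.0, 0.0
    A[0, 0] = A[-1, -1] = 1.0                         # zero Dirichlet boundaries
    rhs = U[n - 1] - hist
    rhs[0] = rhs[-1] = 0.0
    U[n] = np.linalg.solve(A, rhs)

print("max |u| at final time:", np.abs(U[-1]).max())
```

At each step the current order alpha(t_n) sets both the memory weights and the effective diffusion number r, which is what distinguishes the variable-order problem from the constant-order one.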
Abstract:
In this paper, a class of fractional advection–dispersion models (FADMs) is considered. These models include five fractional advection–dispersion models: the time FADM, the mobile/immobile time FADM with a time Caputo fractional derivative of order 0 < γ < 1, the space FADM with two-sided Riemann–Liouville derivatives, the time–space FADM, and the time-fractional advection–diffusion-wave model with damping, with index 1 < γ < 2. These equations can be used to simulate regional-scale anomalous dispersion with heavy tails. We propose computationally effective implicit numerical methods for these FADMs. The stability and convergence of the implicit numerical methods are analysed and compared systematically. Finally, numerical results are given to demonstrate the effectiveness of the theoretical analysis.
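For the space-fractional ingredient mentioned above, the shifted Grünwald–Letnikov formula is the discretisation most commonly paired with implicit time stepping for two-sided Riemann–Liouville derivatives. The sketch below builds that operator and takes implicit Euler steps; it illustrates the general class of method, not the paper's exact schemes, and all parameter values are hypothetical.

```python
import numpy as np

def gl_weights(alpha: float, n: int) -> np.ndarray:
    """Gruenwald weights g_k = (-1)^k * binom(alpha, k), computed recursively."""
    g = np.empty(n)
    g[0] = 1.0
    for k in range(1, n):
        g[k] = g[k - 1] * (k - 1.0 - alpha) / k
    return g

def two_sided_operator(alpha: float, nx: int, h: float) -> np.ndarray:
    """Matrix for 0.5 * (left + right) Riemann-Liouville derivative, 1 < alpha < 2."""
    g = gl_weights(alpha, nx + 1)
    A = np.zeros((nx, nx))
    for i in range(nx):
        for j in range(nx):
            if j <= i + 1:
                A[i, j] += g[i - j + 1]   # left-sided sum, shifted by one node
            if j >= i - 1:
                A[i, j] += g[j - i + 1]   # right-sided sum, shifted by one node
    return 0.5 * A / h**alpha

# Implicit Euler for u_t = D * two-sided fractional diffusion, zero boundaries
nx = 50
h, dt, D = 1.0 / (nx + 1), 1e-3, 0.5
Afrac = two_sided_operator(1.8, nx, h)
x = np.linspace(h, 1.0 - h, nx)
u = np.exp(-200.0 * (x - 0.5) ** 2)       # hypothetical initial pulse

for _ in range(100):
    u = np.linalg.solve(np.eye(nx) - dt * D * Afrac, u)
print("max |u| after 100 steps:", u.max())
```

The heavy-tailed spreading the abstract refers to shows up here as the dense (non-tridiagonal) structure of the fractional operator matrix: every node influences every other at each step.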
Abstract:
Multi-term time-fractional differential equations have been used to describe important physical phenomena. However, studies of multi-term time-fractional partial differential equations with the three kinds of nonhomogeneous boundary conditions are still limited. In this paper, a method of separating variables is used to solve the multi-term time-fractional diffusion-wave equation and the multi-term time-fractional diffusion equation in a finite domain. In both equations, the time-fractional derivative is defined in the Caputo sense. We discuss and derive the analytical solutions of the two equations with three kinds of nonhomogeneous boundary conditions, namely Dirichlet, Neumann and Robin conditions, respectively.
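For orientation, the single-term homogeneous case shows the structure that separation of variables produces; the multi-term and nonhomogeneous cases treated in the paper generalise the time factor and shift the boundary data, respectively. A sketch for Caputo order 0 < γ < 1 with homogeneous Dirichlet conditions:

```latex
% Ansatz u(x,t) = X(x) T(t) in  D_t^gamma u = kappa u_xx  separates into
\[
\frac{{}^{C}\!D_t^{\gamma} T(t)}{\kappa\,T(t)} = \frac{X''(x)}{X(x)} = -\lambda_n,
\qquad
X_n(x) = \sin\frac{n\pi x}{L}, \quad \lambda_n = \Bigl(\frac{n\pi}{L}\Bigr)^{2}.
\]
% The fractional ODE  {}^{C}D_t^{gamma} T = -kappa lambda_n T  is solved by the
% Mittag-Leffler function, giving the series solution
\[
u(x,t) = \sum_{n=1}^{\infty} c_n\,
E_{\gamma}\!\bigl(-\kappa\,\lambda_n\,t^{\gamma}\bigr)\,\sin\frac{n\pi x}{L},
\qquad
c_n = \frac{2}{L}\int_{0}^{L} u(x,0)\,\sin\frac{n\pi x}{L}\,dx .
\]
```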