348 results for Ferritin-Reference value
Abstract:
The existing distinction between macro and micro approaches has been hampering the advance of Information Systems (IS) research. Both approaches have been criticized for explaining one level while neglecting the other; the current situation therefore calls for multilevel research to address these deficiencies. Instead of studying a single level (macro or micro), multilevel research entails more than one level of conceptualization and analysis simultaneously. Because the notion of the multilevel is borrowed from reference disciplines, confusion and inconsistency persist within the IS discipline, which hinders the adoption of multilevel research. This paper argues for the potential value of multilevel research by investigating its current application within the IS domain. A content analysis of multilevel research articles from major IS conferences and journals is presented. The results suggest that IS scholars have applied multilevel research to produce high-quality work across a variety of topics. However, researchers have not been defining “multilevel” consistently, leading to idiosyncratic meanings of multilevel research that most often rest on authors’ own interpretations. We argue that a rigorous definition of “multilevel research” needs to be explicated for consistency across the research community.
Abstract:
Design Led Innovation is emerging as a fundamental business process that is rapidly being adopted by large firms as well as small to medium sized ones. The value that design brings to an organisation is a different way of thinking, of framing situations and possibilities, of doing things and tackling problems: essentially a cultural transformation of the way the firm undertakes its business. Being Design Led is increasingly seen by business as a driver of company growth, allowing firms to offer a strong point of difference to their stakeholders. Achieving this Design Led process requires strong leadership that enables the organisation to develop a clear vision for top-line growth, a vision based on deep customer insights, expanded through customer and stakeholder engagement, and adopted across all aspects of the business. To achieve this goal, several tools and processes are available, which need to be linked to new organisational capabilities within a business transformation context. The Design Led Innovation Team focuses on embedding tools and processes within an organisation and matching these with design leadership qualities to enable companies to create breakthrough innovation and achieve sustained growth, ultimately by transforming their business model. As all information for these case studies was derived from publicly available data, this resource is not intended to be used as reference material; rather, it is a learning tool for designers to begin to consider and explore businesses at a strategic level. It is not the results that are key, but rather the process and philosophies that were used to create these case studies and to disseminate this way of thinking amongst the design community. It is this process of unpacking a business, guided by the framework of Osterwalder’s Business Model Canvas, which provides an important tool for designers to gain a greater perspective on a company’s true innovation potential.
Abstract:
As the world’s population grows, so does the demand for agricultural products. However, natural nitrogen (N) fixation and phosphorus (P) availability cannot sustain the rising agricultural production; thus, the application of N and P fertilisers as additional nutrient sources is common. It is these anthropogenic activities that can contribute high amounts of organic and inorganic nutrients to both surface and groundwaters, resulting in degradation of water quality and a possible reduction of aquatic life. In addition, runoff and sewage from urban and residential areas can contain high amounts of inorganic and organic nutrients, which may also affect water quality. For example, blooms of the cyanobacterium Lyngbya majuscula along the coastline of southeast Queensland are an indicator of at least short-term decreases in water quality. Although Australian catchments, including those with intensive forms of land use, generally show a low export of nutrients compared to North American and European catchments, certain land use practices may still have a detrimental effect on the coastal environment. Numerous studies of nutrient cycling and associated processes at the catchment scale have been reported in the Northern Hemisphere. Comparable studies in Australia, in particular in subtropical regions, are limited, however, and there is a paucity of data, in particular for inorganic and organic forms of nitrogen and phosphorus; these nutrients are important limiting factors for algal blooms in surface waters. Therefore, monitoring N and P and understanding the sources and pathways of these nutrients within a catchment are important in coastal zone management. Although Australia is the driest continent, in subtropical regions such as southeast Queensland rainfall patterns have a significant effect on runoff and thus on the nutrient cycle at the catchment scale. Increasingly, these rainfall patterns are becoming variable. Monitoring these climatic conditions and the hydrological response of agricultural catchments is therefore also important to reduce the anthropogenic effects on surface and groundwater quality. This study consists of an integrated hydrological–hydrochemical approach that assesses N and P in an environment with multiple land uses. The main aim is to determine the nutrient cycle within a representative coastal catchment in southeast Queensland, the Elimbah Creek catchment. In particular, the investigation confirms the influence of forestry and agriculture on the forms, sources, distribution and fate of N and P in the surface and groundwaters of this subtropical setting. In addition, the study determines whether N and P are subject to transport into the adjacent estuary and thus into the marine environment; also considered is the effect of local topography, soils and geology on N and P sources and distribution. The thesis is structured around four components, each reported individually. The first paper determines the controls of catchment settings and processes on N and P concentrations in stream water, riverbank sediment and shallow groundwater, in particular during the extended dry conditions encountered during the study. Temporal and spatial factors such as seasonal changes, soil character, land use and catchment morphology are considered, as well as their control over the distributions of N and P in surface waters and associated groundwater.
A total of 30 surface water and 13 shallow groundwater sampling sites were established throughout the catchment to represent the dominant soil types and the land use upstream of each sampling location. Sampling comprised five rounds conducted over one year between October 2008 and November 2009. Surface water and groundwater samples were analysed for all major dissolved inorganic forms of N and for total N. Phosphorus was determined in the form of dissolved reactive P (predominantly orthophosphate) and total P. In addition, extracts of stream bank sediments and soil grab samples were analysed for these N and P species. Findings show that major storm events, in particular after long periods of drought conditions, are the driving force of N cycling. This is expressed by higher inorganic N concentrations in the agricultural subcatchment compared to the forested subcatchment. Nitrate N is the dominant inorganic form of N in both the surface and groundwaters, and values are significantly higher in the groundwaters. Concentrations in the surface water range from 0.03 to 0.34 mg N L⁻¹; organic N concentrations are considerably higher (average range: 0.33 to 0.85 mg N L⁻¹), in particular in the forested subcatchment. Average NO3-N in the groundwater ranges from 0.39 to 2.08 mg N L⁻¹, and organic N averages between 0.07 and 0.3 mg N L⁻¹. The stream bank sediments are dominated by organic N (range: 0.53 to 0.65 mg N L⁻¹), and the dominant inorganic form of N is NH4-N, with values ranging between 0.38 and 0.41 mg N L⁻¹. Topography and soils, however, were not found to have a significant effect on N and P concentrations in waters. Detectable phosphorus in the surface and groundwaters of the catchment is limited to several locations, typically in the proximity of areas with intensive animal use; in soils and sediments, P is negligible. In the second paper, the stable isotopes of N (¹⁴N/¹⁵N) and H2O (¹⁶O/¹⁸O and ²H/¹H) in surface and groundwaters are used to identify sources of dissolved inorganic and organic N in these waters and to determine their pathways within the catchment; specific emphasis is placed on the respective roles of forestry and agriculture. Forestry is predominantly concentrated in the northern subcatchment (Beerburrum Creek), while agriculture is mainly found in the southern subcatchment (Six Mile Creek). Results show that agriculture (horticulture, crops, grazing) is the main source of inorganic N in the surface waters of the agricultural subcatchment, and its isotopic signature shows a close link to evaporation processes that may occur during water storage in farm dams used for irrigation. Groundwaters are subject to denitrification processes that may result in reduced dissolved inorganic N concentrations. Soil organic matter delivers most of the inorganic N to the surface water in the forested subcatchment. Here, precipitation and subsequent runoff are the main sources of the surface waters. Groundwater in this area is affected by agricultural processes. The findings also show that the catchment can attenuate the effects of anthropogenic land use on surface water quality. Riparian strips of natural remnant vegetation, commonly 50 to 100 m in width, act as buffer zones along the drainage lines in the catchment and remove inorganic N from the soil water before it enters the creek.
These riparian buffer zones are common in most agricultural catchments of southeast Queensland and are indicated to reduce the impact of agriculture on stream water quality and subsequently on the estuary and marine environments. This reduction is expressed by a significant decrease in DIN concentrations from 1.6 mg N L⁻¹ to 0.09 mg N L⁻¹, and by a decrease in the δ¹⁵N signatures from upstream surface water locations downstream to the outlet of the agricultural subcatchment. Further testing is, however, necessary to confirm these processes. Most importantly, the amount of N that is transported to the adjacent estuary is shown to be negligible. The third and fourth components of the thesis use a hydrological catchment modelling approach to determine the water balance of the Elimbah Creek catchment. The model is then used to simulate the effects of land use on the water balance and nutrient loads of the study area. The tool used is the internationally widely applied Soil and Water Assessment Tool (SWAT). Knowledge about the water cycle of a catchment is imperative in nutrient studies, as processes such as rainfall, surface runoff, soil infiltration and routing of water through the drainage system are the driving forces of the catchment nutrient cycle. Long-term information about discharge volumes of the creeks and rivers does not, however, exist for a number of agricultural catchments in southeast Queensland, and such information is necessary to calibrate and validate numerical models. Therefore, a two-step modelling approach was used in which calibrated and validated parameter values from a nearby gauged reference catchment served as starting values for the ungauged Elimbah Creek catchment. Transposing the monthly calibrated and validated parameter values from the reference catchment to the ungauged catchment significantly improved model performance, showing that the hydrological model of the catchment of interest is a strong predictor of the water balance. The model efficiency coefficient (EF) shows that the simulation matches 94% of the observed flow, whereas only 54% of the observed streamflow was reproduced by the SWAT model prior to using the validated values from the reference catchment. In addition, the hydrological model confirmed that total surface runoff contributes the majority of flow to the surface water in the catchment (65%). Only a small proportion of the water in the creek is contributed by total base-flow (35%). This finding supports the results of the stable isotopes ¹⁶O/¹⁸O and ²H/¹H, which show that the main source of water in the creeks is either local precipitation or irrigation water delivered by surface runoff; a contribution from the groundwater (baseflow) to the creeks could not be identified using ¹⁶O/¹⁸O and ²H/¹H. In addition, the SWAT model calculated that around 68% of the rainfall occurring in the catchment is lost through evapotranspiration, reflecting the prevailing long-term drought conditions observed prior to and during the study. Stream discharge from the forested subcatchment was an order of magnitude lower than discharge from the agricultural Six Mile Creek subcatchment. A change in land use from forestry to agriculture did not significantly change the catchment water balance; however, nutrient loads increased considerably. Conversely, a simulated change from agriculture to forestry resulted in a significant decrease of nitrogen loads.
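For readers unfamiliar with the model efficiency coefficient (EF), which in the SWAT literature usually denotes the Nash-Sutcliffe efficiency, the sketch below shows how it is computed from paired observed and simulated discharge series. The sample arrays are hypothetical placeholders, not data from this study, and the Nash-Sutcliffe interpretation of EF is an assumption based on common SWAT practice.

```python
import numpy as np

def model_efficiency(observed, simulated):
    """Model efficiency coefficient EF (Nash-Sutcliffe efficiency).

    EF = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)
    EF = 1 is a perfect match; EF = 0 means the model is no better
    than simply using the mean of the observed flows.
    """
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residual_ss = np.sum((observed - simulated) ** 2)
    total_ss = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - residual_ss / total_ss

# Hypothetical monthly discharge values (ML/month), for illustration only.
obs = [12.0, 30.5, 8.2, 4.1, 2.0, 55.3]
sim = [10.8, 28.9, 9.0, 5.0, 2.4, 50.1]
print(f"EF = {model_efficiency(obs, sim):.2f}")
```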
The findings of the thesis and the approach used are shown to be of value to catchment water quality monitoring on a wider scale, in particular regarding the implications of mixed land use for nutrient forms, distributions and concentrations. The study confirms that in the tropics and subtropics the water balance is affected by extended dry periods and seasonal rainfall with intensive storm events. In particular, the comprehensive data set of inorganic and organic N and P forms in the surface and groundwaters of this subtropical setting, acquired during the one-year sampling program, may be used in similar catchment hydrological studies where such detailed information is missing. The study also concludes that riparian buffer zones along the catchment drainage system attenuate the transport of nitrogen from agricultural sources in the surface water. Concentrations of N decreased from upstream to downstream locations and were negligible at the outlet of the catchment.
Abstract:
The late eighteenth century witnessed the emergence of new technologies of subjectivity and of the literary. Most obviously, “the novel as a literary form appeared to embody and turn into an object the experience of life itself” (Park), and the novel genre came to both reflect and shape notions of interiority and subjectivity. In this same period, “A shift was taking place in the way people felt and thought about children and the accoutrements of childhood, including books and toys, were implicated in this change” (Lewis). In seeking to understand the relationships between media (e.g. books and toys), genres (e.g. novels and picture books), and modes of subjectivity, Marx’s influential theory of commodity fetishism, whereby “a definite social relation between men, that assumes, in their eyes, the fantastic form of a relation between things”, has served as a productive tool of analysis. The extent to which Marx’s account of commodity fetishism continues to be of use becomes clear when the corollaries between the late eighteenth-century emergence of novels and picture books as technologies of subjectivity and the early twenty-first-century emergence of e-readers and digital texts as technologies of subjectivity are considered. This paper considers the literary technology of Apple’s iPad (first launched in 2010) as a commodity fetish, and the circulation of “apps” as texts made available by, and offered as justifications for, this fetish object. The iPad is both book and toy, but is never “only” either; it is arguably a new technology of subjectivity which incorporates but also destabilises categories of reading and playing such as those made familiar by earlier technologies of literature and the self. The particular focus of this paper is on the multimodal versions (app, film, and picture book) of The Fantastic Flying Books of Mr. Morris Lessmore, which are understood here as a narrativisation of commodity fetishism, subjectivity, and the act of reading itself.
Abstract:
One remaining difficulty in the Information Technology (IT) business value evaluation domain is the direct linkage between IT value and the underlying determinants, or surrogates, of IT value. This paper proposes a research project that examines the interacting effects of the determinants of IT value and their influence on IT value. The overarching research question is how those determinants interact with each other and affect IT value at the organizational level. To achieve this, the research adopts a multilevel, complex, adaptive system view, in which IT value emerges from the interactions of the underlying determinants. The research is theoretically grounded on three organizational theories: multilevel theory, complex adaptive systems theory, and adaptive structuration theory. By integrating these theoretical paradigms, the research proposes a conceptual model that focuses on the process through which IT value is created from the interactions of those determinants. To answer the research question, an agent-based modeling technique is used to build a computational representation of the conceptual model. Computational experimentation will be conducted on this representation, validation procedures will be applied to consolidate the validity of the model, and, finally, hypotheses will be tested using the computational experimentation data.
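As an illustration only (the paper does not specify an implementation), a minimal agent-based sketch of this kind of model might look like the following: each agent carries values for a few hypothetical IT-value determinants, agents interact in pairs, and an organisation-level value emerges from the aggregate. The determinant names, update rule and parameters are all assumptions made for demonstration, not elements of the proposed model.

```python
import random

class Agent:
    """A hypothetical organisational actor carrying IT-value determinants."""
    def __init__(self):
        # Assumed determinants: IT capability and usage intensity, in [0, 1].
        self.capability = random.random()
        self.usage = random.random()

    def interact(self, other):
        # Assumed adaptation rule: agents partially align with each other.
        self.capability += 0.1 * (other.capability - self.capability)
        self.usage += 0.1 * (other.usage - self.usage)

def organisational_it_value(agents):
    # Assumed emergence rule: value arises from capability applied in use.
    return sum(a.capability * a.usage for a in agents) / len(agents)

random.seed(42)
agents = [Agent() for _ in range(50)]
for step in range(100):
    a, b = random.sample(agents, 2)
    a.interact(b)
    b.interact(a)

print(f"Emergent organisational IT value: {organisational_it_value(agents):.3f}")
```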
Abstract:
Introduction: The motivation for developing megavoltage (and kilovoltage) cone beam CT (MV CBCT) capabilities in the radiotherapy treatment room was primarily based on the need to improve patient set-up accuracy. There has recently been interest in using the cone beam CT data for treatment planning. Accurate treatment planning, however, requires knowledge of the electron density of the tissues receiving radiation in order to calculate dose distributions. This is obtained from CT, utilising a conversion between CT number and the electron density of various tissues. The use of MV CBCT has particular advantages over treatment planning with kilovoltage CT in the presence of high atomic number materials, and requires the conversion of pixel values from the image sets to electron density. Therefore, a study was undertaken to characterise the pixel value to electron density (ED) relationship for the Siemens MV CBCT system, MVision, and to determine the effect, if any, of varying the number of monitor units (MU) used for acquisition. If a significant dependence on the number of monitor units were seen, then separate pixel value to ED conversions might be required for each of the clinical settings. The calibration of the MV CBCT images for electron density offers the possibility of a daily recalculation of the dose distribution and the introduction of new adaptive radiotherapy treatment strategies. Methods: A Gammex Electron Density CT Phantom was imaged with the MV CBCT system. The pixel value for each of the sixteen inserts, which ranged from 0.292 to 1.707 in electron density relative to the background solid water, was determined by taking the mean value from within a region of interest centred on the insert, over 5 slices within the centre of the phantom. These results were averaged and plotted against the relative electron densities of each insert, and a linear least squares fit was performed. This procedure was carried out for images acquired with 5, 8, 15 and 60 monitor units. Results: A linear relationship between MV CBCT pixel value and ED was demonstrated for all monitor unit settings and over a range of electron densities. The number of monitor units utilised was found to have no significant impact on this relationship. Discussion: It was found that the number of MU utilised does not significantly alter the pixel value obtained for different ED materials. However, to ensure the most accurate and reproducible pixel value to ED calibration, one MU setting should be chosen and used routinely. To ensure accuracy for the clinical situation, this MU setting should correspond to that which is used clinically. If more than one MU setting is used clinically, then an average of the CT values acquired with different numbers of MU could be utilised without loss of accuracy. Conclusions: No significant differences have been shown between the pixel value to ED conversions for the Siemens MV CBCT unit with changes in monitor units. Thus a single conversion curve could be utilised for MV CBCT treatment planning. To fully utilise MV CBCT imaging for radiotherapy treatment planning, further work will be undertaken to ensure all corrections have been made and dose calculations verified. These dose calculations may be either for treatment planning purposes or for reconstructing the delivered dose distribution from transit dosimetry measurements made using electronic portal imaging devices.
This will potentially allow the cumulative dose distribution to be determined over the patient’s multi-fraction treatment and adaptive treatment strategies to be developed to optimise the tumour response.
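A minimal sketch of the calibration step described in this abstract is given below: a linear least squares fit of relative electron density against mean pixel value, which can then be used to convert pixel values. The insert readings are hypothetical placeholders; only the 0.292 to 1.707 relative ED range is taken from the abstract.

```python
import numpy as np

# Hypothetical mean pixel values for a few phantom inserts (placeholders),
# paired with their known electron densities relative to solid water (RED).
relative_ed = np.array([0.292, 0.48, 1.00, 1.28, 1.707])
mean_pixel = np.array([210.0, 395.0, 1010.0, 1340.0, 1835.0])

# Linear least squares fit: RED = slope * pixel + intercept.
slope, intercept = np.polyfit(mean_pixel, relative_ed, deg=1)
print(f"RED = {slope:.5f} * pixel + {intercept:.3f}")

def pixel_to_relative_ed(pixel_value):
    """Convert an MV CBCT pixel value to relative electron density."""
    return slope * pixel_value + intercept

print(f"Pixel 900 -> RED {pixel_to_relative_ed(900.0):.3f}")
```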
Abstract:
This paper investigates the relationship between US MNCs' valuations and anti-Americanism in countries where MNCs' foreign subsidiaries are located. We find that MNCs suffer value-destruction when they enter markets where people express severe anti-Americanism. However, we uncover that geographic diversification into these high anti-Americanism countries significantly increases firm value if the MNC has high levels of intangibles such as technological know-how and marketing expertise. Our findings are consistent with the notion that the advantages from internalizing the cross-border transfer of intangibles are greater when barriers to competition are higher.
Abstract:
Whether to keep products segregated (e.g., unbundled) or to integrate some or all of them (e.g., bundle) is a problem of profound interest in areas such as portfolio theory in finance, risk capital allocation in insurance, and the marketing of consumer products. Such decisions are inherently complex and depend on factors such as the underlying product values and consumer preferences, the latter frequently described using value functions, also known as utility functions in economics. In this paper, we develop decision rules for multiple products, which we generally call ‘exposure units’ to naturally cover manifold scenarios spanning well beyond ‘products’. Our findings show, for example, that Thaler’s celebrated principles of mental accounting hold as originally postulated when the values of all exposure units are positive (i.e. all are gains) or all negative (i.e. all are losses). In the case of exposure units with mixed-sign values, the decision rules are much more complex and rely on cataloguing a Bell number of cases, which grows very quickly with the number of exposure units. Consequently, in the present paper we provide detailed rules for the integration and segregation decisions in the case of up to three exposure units, and partial rules for an arbitrary number of units.
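To make the basic decision rule concrete, the sketch below evaluates Thaler’s prescription under a prospect-theory-style value function: segregate two outcomes when v(x) + v(y) > v(x + y), integrate otherwise. The parameter values are the widely cited Tversky-Kahneman estimates, used here purely for illustration; they are an assumption, not values from this paper.

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function (Tversky-Kahneman 1992 estimates)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

def prefer_segregation(x, y):
    """True if valuing x and y separately beats valuing their sum."""
    return value(x) + value(y) > value(x + y)

# Two gains: segregation is preferred (v is concave over gains).
print(prefer_segregation(50, 70))    # True
# Two losses: integration is preferred (v is convex over losses).
print(prefer_segregation(-50, -70))  # False
```

Mixed-sign cases (one gain, one loss) are exactly where the simple comparison above stops being sufficient, which is why the paper catalogues the rapidly growing number of cases for three or more exposure units.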
Abstract:
Over the past few years many organizations that directly or indirectly interact with consumers have invested heavily in a social media presence. As a consequence, some success indicators are openly available to users of many social media platforms, such as the number of fans (or followers, members, visitors and others) or the amount of content (tweets, images, shares and the like). Many organizations additionally track their social activities internally to understand audience reach, consumer influence, brand image, consumer preference or other key metrics that make sense for the business. However, most of the immediately available social media success metrics are activity-based, and many organizations struggle to establish a direct relationship with business success. This paper systematically reviews some of the common social media metrics and ratings used by organisations, critically analyses their business value, identifies gaps by formulating research questions for empirical study, and concludes with recommendations and suggestions for future research.
Abstract:
How can we reach out to institutions, artists and audiences with sometimes radically different agendas to encourage them to see, participate in and support the development of new practices and programs in the performing arts? In this paper, based on a plenary panel at PSi#18 Performance Culture Industry at the University of Leeds, Clarissa Ruiz (Colombia), Anuradha Kapur (India) and Sheena Wrigley (England), together with interlocutor Bree Hadley (Australia), speak about their work as policy-makers, managers and producers in the performing arts in Europe, Asia and America over the past several decades. Acknowledged trailblazers in their fields, Ruiz, Kapur and Wrigley all have a commitment to creating vital, viable and sustainable performing arts ecologies. Each has extensive experience in performance, politics, and the challenging process of managing histories, visions, stakeholders, and sometimes scarce resources to generate lasting benefits for the various communities they have worked for, with and within. Their work cultivating new initiatives, programs or policy has made them expert at brokering relationships in and between private, public and political spheres to elevate the status of, and support for, the performing arts as a socially and economically beneficial activity everyone can participate in. Each gives examples from their own practice to provide insight into how to negotiate the interests of artistic, government, corporate, community and education partners, and the interests of audiences, to create aesthetic, cultural and/or economic value. Together, their views offer a compelling set of perspectives on the changing meanings of the ‘value of the arts’ and the effects this has had for the artists that make, and the arts organisations that produce and present, work in a range of different regional, national and cross-national contexts.
Abstract:
After attending this presentation, attendees will gain awareness of: (1) the error and uncertainty associated with the application of the Suchey-Brooks (S-B) method of age estimation of the pubic symphysis to a contemporary Australian population; (2) the implications of sexual dimorphism and bilateral asymmetry of the pubic symphysis through preliminary geometric morphometric assessment; and (3) the value of three-dimensional (3D) autopsy data acquisition for creating forensic anthropological standards. This presentation will impact the forensic science community by demonstrating that, in the absence of demographically sound skeletal collections, post-mortem autopsy data provide an exciting platform for the construction of large contemporary ‘virtual osteological libraries’ with which forensic anthropological research can be conducted on Australian individuals. More specifically, this study assesses the applicability and accuracy of the S-B method in a contemporary adult population in Queensland, Australia, and, using a geometric morphometric approach, provides insight into the age-related degeneration of the pubic symphysis. Despite the prominent use of the Suchey-Brooks (1990) method of age estimation in forensic anthropological practice, it is subject to intrinsic limitations, with reports of differential inter-population error rates between geographical locations [1-4]. Australian forensic anthropology is constrained by a paucity of population-specific standards due to a lack of repositories of documented skeletons. Consequently, in Australian casework proceedings, standards constructed from predominantly American reference samples are applied to establish a biological profile. In the global era of terrorism and natural disasters, more specific population standards are required to improve the efficiency of medico-legal death investigation in Queensland. The sample comprises multi-slice computed tomography (MSCT) scans of the pubic symphysis (slice thickness: 0.5 mm, overlap: 0.1 mm) of 195 individuals of Caucasian ethnicity aged 15-70 years. Volume rendering reconstruction of the symphyseal surface was conducted in Amira® (v.4.1) and quantitative analyses in Rapidform® XOS. The sample was divided into ten-year age sub-sets (e.g. 15-24 years) with a final sub-set of 65-70 years. Error with respect to the method’s assigned means was analysed on the basis of bias (directionality of error), inaccuracy (magnitude of error) and percentage of correct classification of left and right symphyseal surfaces. Morphometric variables including surface area, circumference, maximum height and width of the symphyseal surface, together with micro-architectural assessment of cortical and trabecular bone composition, were quantified using novel automated engineering software capabilities. The results of this study demonstrated correct age classification, utilizing the means and standard deviations of each phase of the S-B method, of 80.02% and 86.18% in Australian males and females, respectively. Application of the S-B method resulted in positive biases and mean inaccuracies of 7.24 (±6.56) years for individuals less than 55 years of age, compared to negative biases and mean inaccuracies of 5.89 (±3.90) years for individuals greater than 55 years of age. Statistically significant differences between chronological age and S-B mean age were demonstrated in 83.33% and 50% of the six age sub-sets in males and females, respectively.
Asymmetry of the pubic symphysis was a frequent phenomenon, with 53.33% of the Queensland population exhibiting statistically significant (χ², p<0.01) differential phase classification of the left and right surfaces of the same individual. Directionality was found in the bilateral asymmetry, with the right symphyseal faces appearing slightly older on average and providing more accurate estimates using the S-B method [5]. Morphometric analysis verified these findings, with the left surface exhibiting significantly greater circumference and surface area than the right (p<0.05). Morphometric analysis also demonstrated an increase in the maximum height and width of the surface with age, with the most significant changes (p<0.05) occurring between the 25-34 and 55-64 year age sub-sets. These differences may be attributed to hormonal components linked to menopause in females and a reduction in testosterone in males. Micro-architectural analysis demonstrated degradation of cortical composition with age, with differential bone resorption between the medial, ventral and dorsal surfaces of the pubic symphysis. This study recommends that the S-B method be applied with caution in medico-legal death investigations of unknown skeletal remains in Queensland. Age estimation will always be accompanied by error; this study therefore demonstrates the potential of quantitative morphometric modelling of age-related changes of the pubic symphysis as a tool for methodological refinement, providing a rigorous and robust assessment to remove the subjectivity associated with current pelvic aging methods.
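The error measures reported above are straightforward to compute: bias is the signed mean difference between estimated and chronological age, and inaccuracy is the mean absolute difference. The sketch below illustrates this with hypothetical ages, not data from this study.

```python
import numpy as np

def bias_and_inaccuracy(estimated_age, chronological_age):
    """Bias (directionality of error) and inaccuracy (magnitude of error)."""
    est = np.asarray(estimated_age, dtype=float)
    chron = np.asarray(chronological_age, dtype=float)
    bias = np.mean(est - chron)            # positive = over-ageing on average
    inaccuracy = np.mean(np.abs(est - chron))
    return bias, inaccuracy

# Hypothetical example: S-B phase mean ages vs known ages for five cases.
estimated = [23.4, 28.7, 38.2, 48.1, 60.0]
known = [21.0, 33.0, 35.0, 55.0, 67.0]
b, i = bias_and_inaccuracy(estimated, known)
print(f"bias = {b:+.2f} years, inaccuracy = {i:.2f} years")
```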
Abstract:
Osteocytes are the most abundant cells in human bone tissue. Due to their unique morphology and location, osteocytes are thought to act as regulators in the bone remodelling process and are believed to play an important role in astronauts’ bone mass loss after long-term space missions. There is increasing evidence that an osteocyte’s functions are highly affected by its morphology. However, changes in an osteocyte’s morphology under an altered gravity environment are still not well documented. Several in vitro studies have recently been conducted to investigate the morphological response of osteocytes to the microgravity environment, in which osteocytes were cultured on a two-dimensional flat surface for at least 24 hours before the microgravity experiments. Morphology changes of osteocytes in microgravity were then studied by comparing the cell area to that of 1g control cells. However, osteocytes found in vivo have a more three-dimensional morphology, and both the cell body and the dendritic processes are sensitive to mechanical loading. Round osteocytes support a less stiff cytoskeleton and are more sensitive to mechanical stimulation than cells with a flat morphology. Thus, the relatively flat and spread shape of isolated osteocytes in 2D culture may greatly hamper their sensitivity to a mechanical stimulus, and the lack of knowledge of the osteocyte’s morphological characteristics in culture may lead to subjective and non-comprehensive conclusions about how altered gravity affects an osteocyte’s morphology. Through this work, empirical models were developed to quantitatively predict the changes of morphology in an osteocyte cell line (MLO-Y4) in culture, and the response of osteocytes that are relatively round in shape to hyper-gravity stimulation was also investigated. The morphology changes of MLO-Y4 cells in culture were quantified by measuring cell area and three dimensionless shape features, aspect ratio, circularity and solidity, using widely accepted image analysis software (ImageJ™). MLO-Y4 cells were cultured at low density (5×10³ per well) and the changes in morphology were recorded over 10 hours. Based on the data obtained from the image analysis, empirical models were developed using a non-linear regression method. The developed empirical models accurately predict the morphology of MLO-Y4 cells for different culture times and can, therefore, be used as a reference model for analysing MLO-Y4 cell morphology changes within various biological/mechanical studies, as necessary. The morphological response of MLO-Y4 cells with a relatively round morphology to a hyper-gravity environment was investigated using a centrifuge. After 2 hours of culture, MLO-Y4 cells were exposed to 20g for 30 minutes. Changes in the morphology of the MLO-Y4 cells were quantitatively analysed by measuring the average value of cell area and the dimensionless shape factors aspect ratio, solidity and circularity. In this study, no significant morphology changes were detected in MLO-Y4 cells under the hyper-gravity environment (20g for 30 minutes) compared with 1g control cells.
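For reference, the dimensionless shape features named above (aspect ratio, circularity, solidity) can be computed from a segmented cell mask in a few lines. The sketch below uses scikit-image region properties rather than ImageJ, and assumes a hypothetical binary mask file; the file name and segmentation step are placeholders, not part of the study.

```python
import math
from skimage.io import imread
from skimage.measure import label, regionprops

# Hypothetical binary mask of segmented MLO-Y4 cells (placeholder file name).
mask = imread("mlo_y4_mask.png") > 0
regions = regionprops(label(mask))

for cell in regions:
    area = cell.area                                          # cell area (pixels)
    aspect_ratio = cell.major_axis_length / cell.minor_axis_length
    circularity = 4.0 * math.pi * area / cell.perimeter ** 2  # 1.0 = perfect circle
    solidity = cell.solidity                                  # area / convex hull area
    print(f"area={area}, AR={aspect_ratio:.2f}, "
          f"circ={circularity:.2f}, solidity={solidity:.2f}")
```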
Abstract:
Peer review is a reflective process which allows us to formalise, and gain maximum benefit from, collegial feedback on our professional performance. It is also a process that encourages us to engage in cycles of planning, acting, recording and reflection which are familiar components of action learning and action research. Entering into these cycles within the peer-review framework is a powerful and cost-effective means of facilitating professional development which is readily adapted to the library context. In 1996, a project implementing peer review, in order to improve client interaction at the reference desk, was completed at the University of Southern Queensland (USQ) Library. For that project we developed a set of guidelines for library staff involved in peer review. These guidelines explained the value of peer review, and described its principles and purposes. We also devised strategies to assist staff as they prepared for the experience of peer review, engaged in the process and reflected on the outcomes. A number of benefits were identified; the peer-review process enhanced team spirit, enhanced client-orientation, and fostered collaborative efforts in improving the reference service. It was also relatively inexpensive to implement. In this paper we will discuss the nature of peer review and its importance to library and information professionals. We will also share the guidelines we developed, and discuss the implementation and outcomes of the peer review project at the University of Southern Queensland. We will conclude by discussing the benefits perceived and the issues that arose in the USQ context, and by suggesting a range of other aspects of library service in which peer-review could be implemented.
Abstract:
The production of adequate agricultural outputs to support the growing human population places great demands on agriculture, especially in light of ever-greater restrictions on input resources. Sorghum is a drought-adapted cereal capable of reliable production where other cereals fail, and thus represents a good candidate to address food security as agricultural inputs of water and arable land grow scarce. A long-standing issue with sorghum grain is that it has an inherently lower digestibility. Here we show that a low-frequency allele type in the starch metabolic gene, pullulanase, is associated with increased digestibility, regardless of genotypic background. We also provide evidence that the beneficial allele type is not associated with deleterious pleiotropic effects in the modern field environment. We argue that increasing the digestibility of an adapted crop is a viable way forward towards addressing food security while maximizing water and land-use efficiency.
Abstract:
It is apparent that IT resources are important for organisations. It is also clear that organisations’ unique competencies, their IT-related capabilities, leverage IT resources in unique ways to create and sustain competitive advantage. However, IT resources are dynamic and evolve at an exponential rate, which means that organisations need to sustain their competencies in order to leverage the opportunities offered by new IT resources. Research on ways to sustain IT-related capabilities is limited, and a deeper understanding of this situation is important. Amongst other factors, a possible reason for the lack of progress in this area is the lack of validated measurement items for the theoretical constructs needed to conduct such studies. We suggest an environment in which organisations could build new IT-related capabilities and sustain their existing ones. We then report on the development of valid and reliable measures for this environment. The validated measures would be useful in extending our understanding of how firms can sustain their IT-related capabilities. This effort will provide a deeper understanding of how firms can secure sustainable IT-related business value from their acquired IT resources.