992 results for Meyer, Marvin
Abstract:
The purpose of this paper is to identify goal conflicts – both actual and potential – between climate and social policies in government strategies in response to the growing significance of climate change as a socioecological issue (IPCC 2007). Both social and climate policies are political responses to long-term societal trends related to capitalist development, industrialisation, and urbanisation (Koch, 2012). Both modify these processes through regulation, fiscal transfers and other measures, thereby affecting conditions for the other. This means that there are fields of tension and synergy between social policy and climate change policy. Exploring these tensions and synergies is an increasingly important task for navigating genuinely sustainable development. Gough et al. (2008) highlight three potential synergies between social and climate change policies. First, income redistribution – a traditional concern of social policy – can facilitate the use of carbon pricing and enhance its efficiency. A second area of synergy comprises housing, transport, urban policies and community development, all of which have the potential to contribute crucially to reducing carbon emissions. Finally, climate change mitigation will require substantial and rapid shifts in producer and consumer behaviour. Land use planning policy is a critical bridge between climate change and social policy that provides a means to explore the tensions and synergies evolving within this context. This paper focuses on spatial planning as an opportunity to develop strategies to adapt to climate change, and reviews the challenges of such change. Land use and spatial planning involve the allocation of land and the design and control of spatial patterns. Spatial planning is identified as one of the most effective means of adapting settlements in response to climate change (Hurlimann and March, 2012).
It provides the instrumental framework for adaptation (Meyer et al., 2010) and operates as both a mechanism to achieve adaptation and a forum to negotiate priorities surrounding adaptation (Davoudi et al., 2009). The acknowledged role of spatial planning in adaptation, however, has not translated into comparably significant consideration in the planning literature (Davoudi et al., 2009; Hurlimann and March, 2012). The discourse on adaptation specifically through spatial planning is described as ‘missing’ and ‘subordinate’ in national adaptation plans (Greiving and Fleischhauer, 2012), ‘underrepresented’ (Roggema et al., 2012) and ‘limited and disparate’ in the planning literature (Davoudi et al., 2009). Hurlimann and March (2012) suggest this may be due to limited experiences of adaptation in developed nations, while Roggema et al. (2012) and Crane and Landis (2010) suggest it is because climate change is a wicked problem involving an unfamiliar problem, various frames of understanding and uncertain solutions. The potential for goal conflicts within this policy forum seems to outweigh the synergies. Yet spatial planning will be a critical policy tool in the future to both protect communities and adapt them to climate change.
Abstract:
Recent experiments [F. E. Pinkerton, M. S. Meyer, G. P. Meisner, M. P. Balogh, and J. J. Vajo, J. Phys. Chem. C 111, 12881 (2007) and J. J. Vajo and G. L. Olson, Scripta Mater. 56, 829 (2007)] demonstrated that the recycling of hydrogen in the coupled LiBH4/MgH2 system is fully reversible. The rehydrogenation of MgB2 is an important step toward this reversibility. Using ab initio density functional theory calculations, we found that the activation barriers for the dissociation of H2 are 0.49 and 0.58 eV for the B- and Mg-terminated MgB2(0001) surfaces, respectively. This implies that the dissociation kinetics of H2 on a MgB2(0001) surface should be greatly improved compared to that in pure Mg materials. Additionally, the diffusion of dissociated H atoms on the Mg-terminated MgB2(0001) surface is almost barrierless. Our results shed light on the experimentally observed reversibility and improved kinetics of the coupled LiBH4/MgH2 system.
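As a rough, illustrative calculation (not part of the paper), the reported barrier difference can be translated into a relative dissociation rate via a simple Arrhenius factor; the choice of 600 K and the assumption of equal prefactors are mine:

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def arrhenius_rate_ratio(ea_low_ev, ea_high_ev, temp_k):
    """Rate ratio k_low / k_high for two activation barriers,
    assuming identical Arrhenius prefactors."""
    return math.exp((ea_high_ev - ea_low_ev) / (K_B * temp_k))

# B-terminated (0.49 eV) vs Mg-terminated (0.58 eV) MgB2(0001), at 600 K
ratio = arrhenius_rate_ratio(0.49, 0.58, 600.0)
```

Even a 0.09 eV barrier difference yields a severalfold rate difference at typical hydrogenation temperatures, which is why barrier heights dominate the kinetics.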
Abstract:
We conducted a large-scale association study to identify genes that influence nonfamilial breast cancer risk using a collection of German cases and matched controls and >25,000 single nucleotide polymorphisms located within 16,000 genes. One of the candidate loci identified was located on chromosome 19p13.2 [odds ratio (OR) = 1.5, P = 0.001]. The effect was substantially stronger in the subset of cases with reported family history of breast cancer (OR = 3.4, P = 0.001). The finding was subsequently replicated in two independent collections (combined OR = 1.4, P < 0.001) and was also associated with predisposition to prostate cancer in an independent sample set of prostate cancer cases and matched controls (OR = 1.4, P = 0.002). High-density single nucleotide polymorphism mapping showed that the extent of association spans 20 kb and includes the intercellular adhesion molecule genes ICAM1, ICAM4, and ICAM5. Although genetic variants in ICAM5 showed the strongest association with disease status, ICAM1 is expressed at highest levels in normal and tumor breast tissue. A variant in ICAM5 was also associated with disease progression and prognosis. Because ICAMs are suitable targets for antibodies and small molecules, these findings may not only provide diagnostic and prognostic markers but also new therapeutic opportunities in breast and prostate cancer.
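For readers unfamiliar with the OR statistic reported above, a minimal sketch of how an odds ratio is computed from a 2x2 case-control table; the counts below are hypothetical, chosen only to reproduce an OR near the reported 1.5:

```python
def odds_ratio(case_exposed, case_unexposed, control_exposed, control_unexposed):
    """Odds ratio (a/b) / (c/d) for a standard 2x2 case-control table."""
    case_odds = case_exposed / case_unexposed
    control_odds = control_exposed / control_unexposed
    return case_odds / control_odds

# Hypothetical carrier counts: 150 carriers vs 100 non-carriers among cases,
# 100 carriers vs 100 non-carriers among matched controls
or_estimate = odds_ratio(150, 100, 100, 100)
```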
Abstract:
Crashes on motorways contribute to a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing crashes will help address congestion issues (Meyer, 2008). Crash likelihood estimation studies commonly focus on traffic conditions in a short time window around the time of a crash, while longer-term pre-crash traffic flow trends are neglected. In this paper we show, through data mining techniques, that a relationship between pre-crash traffic flow patterns and crash occurrence on motorways exists, and that this knowledge has the potential to improve the accuracy of existing models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that were matched with traffic flow data from one hour prior to the crash using an incident detection algorithm. Traffic flow trends (traffic speed/occupancy time series) revealed that crashes could be clustered with regard to the dominant traffic flow pattern prior to the crash. Using the k-means clustering method allowed the crashes to be clustered based on their flow trends rather than their distance. Four major trends were found in the clustering results. Based on these findings, crash likelihood estimation algorithms can be fine-tuned based on the monitored traffic flow conditions with a sliding window of 60 minutes to increase the accuracy of the results and minimize false alarms.
Abstract:
Crashes that occur on motorways contribute to a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing the frequency of crashes assists in addressing congestion issues (Meyer, 2008). Crash likelihood estimation studies commonly focus on traffic conditions in a short time window around the time of a crash, while longer-term pre-crash traffic flow trends are neglected. In this paper we show, through data mining techniques, that a relationship between pre-crash traffic flow patterns and crash occurrence on motorways exists. We compare these patterns with normal traffic trends and show that this knowledge has the potential to improve the accuracy of existing models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that were matched with their corresponding traffic flow data using an incident detection algorithm. Traffic trends (traffic speed time series) revealed that crashes can be clustered with regard to the dominant traffic patterns prior to the crash. Using the K-Means clustering method with a Euclidean distance function allowed the crashes to be clustered. Then, normal-situation data were extracted based on the time distribution of crashes and clustered for comparison with the “high risk” clusters. Five major trends were found in the clustering results for both high-risk and normal conditions. The study discovered that traffic regimes differed in their speed trends. Based on these findings, crash likelihood estimation models can be fine-tuned based on the monitored traffic conditions with a sliding window of 30 minutes to increase the accuracy of the results and minimize false alarms.
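The clustering step described above can be sketched as follows; this is a minimal, self-contained K-Means with a Euclidean distance function, not the authors' implementation, and the speed profiles are toy values standing in for the expressway data:

```python
def euclidean(a, b):
    """Euclidean distance between two equal-length time series."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def kmeans(series, k, iters=50):
    """Cluster fixed-length speed time series with K-Means.

    Deterministic farthest-point initialization stands in for random seeding.
    """
    centroids = [series[0]]
    while len(centroids) < k:
        far = max(series, key=lambda s: min(euclidean(s, c) for c in centroids))
        centroids.append(far)
    for _ in range(iters):
        # Assign each series to its nearest centroid
        clusters = [[] for _ in range(k)]
        for s in series:
            nearest = min(range(k), key=lambda i: euclidean(s, centroids[i]))
            clusters[nearest].append(s)
        # Recompute centroids as per-bin means; keep old centroid if empty
        new_centroids = [
            [sum(vals) / len(members) for vals in zip(*members)] if members
            else centroids[i]
            for i, members in enumerate(clusters)
        ]
        if new_centroids == centroids:
            break
        centroids = new_centroids
    return centroids, clusters

# Toy pre-crash speed profiles (km/h over six 5-minute bins), two regimes:
falling = [[80, 70, 55, 40, 30, 25], [82, 68, 52, 42, 33, 27]]
stable = [[95, 94, 96, 95, 93, 94], [90, 92, 91, 93, 92, 90]]
centroids, clusters = kmeans(falling + stable, k=2)
```

With well-separated regimes like these, the falling-speed profiles land in one cluster and the stable profiles in the other, which is the kind of dominant-trend grouping the study exploits.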
Abstract:
Expert searchers engage with information as information brokers, researchers, reference librarians, information architects, faculty who teach advanced search, and in a variety of other information-intensive professions. Their experiences are characterized by a profound understanding of information concepts and skills, and by an agile ability to apply this knowledge to interacting with and having an impact on the information environment. This study explored the learning experiences of searchers to understand the acquisition of search expertise. The research question was: What can be learned about becoming an expert searcher from the learning experiences of proficient novice searchers and highly experienced searchers? The key objectives were: (1) to explore the existence of threshold concepts in search expertise; (2) to improve our understanding of how search expertise is acquired and how novice searchers, intent on becoming experts, can learn to search in more expertlike ways. The participant sample drew from two population groups: (1) highly experienced searchers with a minimum of 20 years of relevant professional experience, including LIS faculty who teach advanced search, information brokers, and search engine developers (11 subjects); and (2) MLIS students who had completed coursework in information retrieval and online searching and demonstrated exceptional ability (9 subjects). Using these two groups allowed a nuanced understanding of the experience of learning to search in expertlike ways, with data from those who search at a very high level as well as those who may be actively developing expertise. The study used semi-structured interviews, search tasks with think-aloud narratives, and talk-after protocols. Searches were screen-captured with simultaneous audio-recording of the think-aloud narrative. Data were coded and analyzed both manually and using NVivo9. Grounded theory allowed categories and themes to emerge from the data.
Categories represented conceptual knowledge and attributes of expert searchers. In accord with grounded theory method, once theoretical saturation was achieved, during the final stage of analysis the data were viewed through the lenses of existing theoretical frameworks. For this study, threshold concept theory (Meyer & Land, 2003) was used to explore which concepts might be threshold concepts. Threshold concepts have been used to explore transformative learning portals in subjects ranging from economics to mathematics. A threshold concept has five defining characteristics: it is transformative (causing a shift in perception), irreversible (unlikely to be forgotten), integrative (unifying separate concepts), troublesome (initially counter-intuitive), and may be bounded. Themes that emerged provided evidence of four concepts which had the characteristics of threshold concepts. The first three were: information environment (the total information environment is perceived and understood); information structures (content, index structures, and retrieval algorithms are understood); and information vocabularies (fluency in search behaviors related to language, including natural language, controlled vocabulary, and finesse using proximity, truncation, and other language-based tools). The fourth threshold concept was concept fusion: the integration of the other three threshold concepts, further defined by three properties: visioning (anticipating next moves), being light on one's 'search feet' (the dancing property), and a profound ontological shift (identity as searcher). In addition to the threshold concepts, findings were reported that were not concept-based, including praxes and traits of expert searchers. A model of search expertise is proposed with the four threshold concepts at its core that also integrates the traits and praxes elicited from the study, attributes likewise long recognized in LIS research as present in professional searchers.
The research provides a deeper understanding of the transformative learning experiences involved in the acquisition of search expertise. It adds to our understanding of search expertise in the context of today's information environment and has implications for teaching advanced search, for research more broadly within library and information science, and for methodologies used to explore threshold concepts.
Abstract:
Crashes that occur on motorways contribute to a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing the frequency of crashes assists in addressing congestion issues (Meyer, 2008). Analysing traffic conditions and discovering risky traffic trends and patterns are essential foundations of crash likelihood estimation studies and still require more attention and investigation. In this paper we show, through data mining techniques, that there is a relationship between pre-crash traffic flow patterns and crash occurrence on motorways, compare these patterns with normal traffic trends, and demonstrate that this knowledge has the potential to improve the accuracy of existing crash likelihood estimation models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that were matched with their corresponding traffic flow data using an incident detection algorithm. Traffic trends (traffic speed time series) revealed that crashes can be clustered with regard to the dominant traffic patterns prior to the crash occurrence. A K-Means clustering algorithm was applied to determine dominant pre-crash traffic patterns. In the first phase of this research, traffic regimes were identified by analysing crashes and normal traffic situations using half an hour of speed data at locations upstream of crashes. The second phase then investigated different combinations of speed risk indicators to distinguish crashes from normal traffic situations more precisely. Five major trends were found in the first phase for both high-risk and normal conditions. The study discovered that traffic regimes differed in their speed trends.
Moreover, the second phase shows that the spatiotemporal difference of speed is the best risk indicator among the combinations of speed-related risk indicators examined. Based on these findings, crash likelihood estimation models can be fine-tuned to increase the accuracy of estimations and minimize false alarms.
Abstract:
This project develops and evaluates a model of curriculum design that aims to assist student learning of foundational disciplinary ‘Threshold Concepts’. The project uses phenomenographic action research, cross-institutional peer collaboration and the Variation Theory of Learning to develop and trial the model. Two contrasting disciplines (Physics and Law) and four institutions (two research-intensive and two universities of technology) were involved in the project, to ensure broad applicability of the model across different disciplines and contexts. The Threshold Concepts selected for curriculum design attention were measurement uncertainty in Physics and legal reasoning in Law. Threshold Concepts are key disciplinary concepts that are inherently troublesome, transformative and integrative in nature. Once understood, such concepts transform students’ views of the discipline because they enable students to coherently integrate what were previously seen as unrelated aspects of the subject, providing new ways of thinking about it (Meyer & Land 2003, 2005, 2006; Land et al. 2008). However, the integrative and transformative nature of such threshold concepts makes them inherently difficult for students to learn, with resulting misunderstandings of concepts being prevalent...
Abstract:
Process models specify behavioral aspects by describing ordering constraints between tasks which must be accomplished to achieve envisioned goals. Tasks usually exchange information by means of data objects, i.e., by writing information to and reading information from data objects. A data object can be characterized by its states and allowed state transitions. In this paper, we propose a notion which checks conformance of a process model with respect to the data objects that its tasks access. This new notion can be used to tell whether, in every execution of a process model, each time a task needs to access a data object in a particular state, the data object is guaranteed to be in the expected state or able to reach the expected state, and hence whether the process model can achieve its goals.
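A minimal sketch of the kind of check this notion enables; the process representation here (a linear task sequence of required/written object states) and the object life cycle are simplified assumptions of mine, not the paper's formalism:

```python
from collections import deque

def reachable(transitions, start, goal):
    """BFS over the object's allowed state transitions."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state == goal:
            return True
        for nxt in transitions.get(state, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

def conforms(task_sequence, transitions, initial_state):
    """Check that each task can see its required object state, in order.

    task_sequence: list of (required_state, written_state or None) pairs.
    """
    state = initial_state
    for required, written in task_sequence:
        if state != required and not reachable(transitions, state, required):
            return False  # expected state neither current nor reachable
        state = written if written is not None else required
    return True

# Hypothetical order-object life cycle
lifecycle = {"created": ["approved", "rejected"], "approved": ["shipped"]}
ok = conforms([("created", "approved"), ("approved", "shipped")], lifecycle, "created")
```

A process that first writes the object to "rejected" and then expects "shipped" would fail the check, since no transition path leads from "rejected" to "shipped" in this life cycle.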
Abstract:
Background Accelerometers have become one of the most common methods of measuring physical activity (PA). Thus, validity of accelerometer data reduction approaches remains an important research area. Yet, few studies directly compare data reduction approaches and other PA measures in free-living samples. Objective To compare PA estimates provided by 3 accelerometer data reduction approaches, steps, and 2 self-reported estimates: Crouter's 2-regression model, Crouter's refined 2-regression model, the weighted cut-point method adopted in the National Health and Nutrition Examination Survey (NHANES; 2003-2004 and 2005-2006 cycles), steps, IPAQ, and 7-day PA recall. Methods A worksite sample (N = 87) completed online surveys and wore ActiGraph GT1M accelerometers and pedometers (SW-200) during waking hours for 7 consecutive days. Daily time spent in sedentary, light, moderate, and vigorous intensity activity and the percentage of participants meeting PA recommendations were calculated and compared. Results Crouter's 2-regression (161.8 +/- 52.3 minutes/day) and refined 2-regression (137.6 +/- 40.3 minutes/day) models provided significantly higher estimates of moderate and vigorous PA and proportions of those meeting PA recommendations (91% and 92%, respectively) as compared with the NHANES weighted cut-point method (39.5 +/- 20.2 minutes/day, 18%). Differences between other measures were also significant. Conclusions When comparing 3 accelerometer cut-point methods, steps, and self-report measures, estimates of PA participation vary substantially.
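The cut-point idea underlying methods like the NHANES approach can be sketched as follows; the thresholds are illustrative (roughly Troiano-style counts/minute), not necessarily those used in the study:

```python
def classify_minute(counts_per_min, cuts=(100, 2020, 5999)):
    """Map one minute of accelerometer counts to an intensity label.

    Cut-points are illustrative assumptions, not the study's exact thresholds.
    """
    sedentary_cut, moderate_cut, vigorous_cut = cuts
    if counts_per_min < sedentary_cut:
        return "sedentary"
    if counts_per_min < moderate_cut:
        return "light"
    if counts_per_min < vigorous_cut:
        return "moderate"
    return "vigorous"

# Four example minutes spanning the intensity range
minutes = [50, 800, 2500, 7000]
labels = [classify_minute(c) for c in minutes]
```

Summing the minutes classified as moderate or vigorous over a day gives the MVPA estimates that the compared data reduction approaches disagree on.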
Abstract:
Both the integrin and insulin-like growth factor binding protein (IGFBP) families independently play important roles in modulating tumor cell growth and progression. We present evidence for a specific cell surface localization and a bimolecular interaction between the αvβ3 integrin and IGFBP-2. The interaction, which could be specifically perturbed using vitronectin and αvβ3 blocking antibodies, was shown to modulate IGF-mediated cellular migration responses. Moreover, this interaction was observed in vivo and correlated with reduced tumor size of the human breast cancer cells, MCF-7β3, which overexpressed the αvβ3 integrin. Collectively, these results indicate that αvβ3 and IGFBP-2 act cooperatively in a negative regulatory manner to reduce tumor growth and the migratory potential of breast cancer cells.
Abstract:
Transportation construction is substantially different from other construction fields due to the widespread use of unit price bidding and competitive contract awarding. The potential for change orders, which can be described as substantial increases in work quantity or reasonable changes to the initial design provided by the State Highway Agencies (SHAs), has thus been the main source of unbalanced bidding by contractors. It is important to understand the causes of change orders, as cost-related issues are the main reason for contract disputes. We have analyzed a large dataset from a major SHA to identify project-related and environmental factors that affect change order costs. The results of the study can be instrumental in assessing the increased costs associated with change orders so that better management measures can be taken to mitigate their effects.
Abstract:
Our research aims to answer the research questions “How do we commonly describe the global start-up profile as evidenced in prior inductive research?” and “Can this global start-up profile effectively explain phenomena in Australian global start-up firms?” We systematically review 29 qualitative articles on global start-ups (144 firms) to understand descriptive definitions of global start-up firms. We then triangulate this finding with an Australian high-tech firm. Our contribution is to form a descriptive profile of the global start-up phenomenon and to raise interesting issues that offer potentially fruitful findings for both research and practice. This profile may well be a deviant from the traditional model that describes how firms establish their footprints first in their domestic markets, followed by moves into cross-border activities. Regardless, government agencies, consultants, and entrepreneurs need to understand the phenomenon. Thus we anticipate that this phenomenon will continue to provide interesting issues for pursuit by both researchers and the practitioner community.