Archive for ‘Thema2 Recherche académique’

15 February 2019

An Introduction to Jeffreys’s Bayes Factors With the SumStats Module in JASP: Part 1 | Bayes methods from: jasp-stats.org

In this blog post we elaborate on the ideas behind Harold Jeffreys’s Bayes factor and illustrate this test with the Summary Statistics module in JASP.

In a previous blog post we discussed the estimation problem, where the goal was to infer, from the observed data, the magnitude of the population effect. Before studying the size of an effect, however, we arguably first need to investigate whether an effect actually exists. Here we address the existence problem with a hypothesis test and we emphasize the difference between testing and estimation.

The outline of this blog post is as follows: Firstly, we discuss a hypothesis proposed in a recent study relating fungal infections to Alzheimer’s disease. This hypothesis is then operationalized within a statistical model, and we discuss Bayesian model learning in general, before we return to the Alzheimer’s example. This is followed by a comparison of the Bayes factor to other methods of inference, and the blog post concludes with a short summary.
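JASP does this through a point-and-click interface, but the same kind of default Bayesian t-test can be sketched in R with the BayesFactor package. The data below are simulated purely for illustration; this is an analogous computation, not the JASP Summary Statistics module itself.

```r
# Minimal sketch of a default Bayesian two-sample t-test in R
# (illustrative only; JASP provides this through its GUI).
# install.packages("BayesFactor")   # if the package is not installed
library(BayesFactor)

set.seed(2019)
control   <- rnorm(30, mean = 0.0, sd = 1)   # hypothetical control group
treatment <- rnorm(30, mean = 0.5, sd = 1)   # hypothetical treatment group

bf10 <- ttestBF(x = treatment, y = control)  # evidence for H1 relative to H0
bf10        # BF10: how much the data favour the alternative hypothesis
1 / bf10    # BF01: how much the data favour the null hypothesis
```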

More here

5 February 2019

A great blog to follow about the science crisis! Replications, the NHST paradigm, meta-reviews, editorial policy, academic retractions, publish-or-perish effects, etc. If you believe science has a huge problem and must change, then you are welcome!

“I am standing for this role because I believe that psychology faces one of two possible futures. In one, we fail to reform our research culture and diminish.”

— NeuroChambers: My manifesto as would-be editor of Psychological Science


http://scienceincrisis.info/


25 January 2019

A great channel to follow: StatQuest with Josh Starmer

25 January 2019

The JASP Guidelines for Conducting and Reporting a Bayesian Analysis

Last week we submitted a paper with guidelines for conducting and reporting a Bayesian analysis (with a focus on JASP). You can find a preprint here.

The Abstract

Despite the increasing popularity of Bayesian inference in empirical research, few practical guidelines provide detailed recommendations for how to apply Bayesian procedures and interpret the results. Here we offer specific guidelines for four different stages of Bayesian statistical reasoning in a research setting: planning the analysis, executing the analysis, interpreting the results, and reporting the results. The guidelines for each stage are illustrated with a running example. Although the guidelines are geared toward analyses performed with the open-source statistical software JASP, most guidelines extend to Bayesian inference in general.

https://jasp-stats.org/2019/01/23/preprint-the-jasp-guidelines-for-conducting-and-reporting-a-bayesian-analysis/?fbclid=IwAR0FzSDPqmYC63QTlHP6BQ4NB8D18Ex204c0YHlRVXllQbOivxbKbM4lC1Q

22 January 2019

Les chemins de la censure (The Paths of Censorship) – Psychological reactance

The paths of censorship: reactance. Technicien suggests to Gull that they explore the island further, focusing all his attention on a mysterious forest called « La forêt censurée » (the censored forest). But Technicien's excitement runs into a brutal refusal from Gull, who orders him, not without threats, never to set foot there. Even more intrigued, Technicien decides to defy the ban and set off alone toward this forbidden region, never imagining how profoundly the journey will transform him, for better or for worse.

This chapter deals with censorship, not with the censored objects, nor with its practice or its actors, but with the psychological effects it can have on the individual. Censorship can be experienced as a deprivation and can thereby generate reactance. Psychological reactance, identified by Brehm in 1966, is a state of tension or motivation that arises in situations where an individual is deprived of a choice, a freedom, a possible action, or a behaviour. It is from the perspective of reactance that we examine censorship in this chapter. For sources and complementary articles, see our website: https://www.hacking-social.com/2019/0…

13 January 2019

Modern science has mysterious roots in hermetic teachings and occult practices! And the source is Britain, the nation said to have « invented the modern world »!

Herman, A. (2001). How the Scots Invented the Modern World: The True Story of How Western Europe's Poorest Nation Created Our World & Everything in It. Crown.

This subject is of paramount importance. Indeed, the mysterious emergence of the dynamics of modern science can be linked to Kabbalistic practices from ancient Egypt. The pioneers of the scientific fields were linked to these forms of occultism, and modern science has whitewashed the esoteric past that is the source of what is called the scientific revolution.
The secret societies and Kabbalistic practices were transformed in England into learned and scientific societies, with the appearance of the Royal Society, inspired by the vision of Francis Bacon.

The mastery of the material world brings us, at the end of the path, to the current understanding that the world is a universe of interacting energies. To what extent is modern science (which is supposed to control matter) not a form of magic that controls the energies around us?

The Matrix was not just a movie… this topic deserves more exploration. Do you see what I see, Neo?

25 November 2018

Corrupt Research: The Case for Reconceptualizing Empirical Management and Social Science

Hubbard, R. (2015). Corrupt research: The case for reconceptualizing empirical management and social science. Sage Publications.

I'm reading this great book by Hubbard, a renowned marketing scholar, about the current NHST paradigm in research. The author defends the idea that we are producing « unreliable science » that is absolutely useless! He summarizes a great deal of evidence that the pervasive use of NHST (null hypothesis significance testing) is bad practice. The author proposes another model and defends ideas similar to those of Geoff Cumming on the « new statistics », based on confidence intervals instead of p values and on open science, presented in Cumming's 2013 book Understanding The New Statistics: Effect Sizes, Confidence Intervals, and Meta-Analysis (Routledge).
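To make the contrast concrete, here is a minimal sketch in R with simulated data invented purely for illustration (it is not an example from either book): the « new statistics » style reports the estimated effect and its uncertainty rather than only a significance verdict.

```r
# Sketch of estimation-style reporting with simulated data:
# an effect size and a 95% confidence interval rather than only a p value.
set.seed(123)
group_a <- rnorm(40, mean = 100, sd = 15)   # hypothetical control scores
group_b <- rnorm(40, mean = 108, sd = 15)   # hypothetical treatment scores

tt <- t.test(group_b, group_a)   # Welch two-sample t-test
tt$conf.int                      # 95% CI for the difference in means

# Standardized effect size (Cohen's d with a pooled SD):
pooled_sd <- sqrt((var(group_a) + var(group_b)) / 2)
(mean(group_b) - mean(group_a)) / pooled_sd
```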

Hubbard's book is a major contribution and a warning to the research community about the current research paradigm, which we can describe as an « ostrich policy »: publishers and academics waste resources (time, money, effort) advancing their careers through « fake or misleading » publications instead of seeking to change the system and to seek the truth! That is the noble objective of science, not citations or rankings! We need a cumulative system that allows replication and integration of findings.

As expected, no one is a prophet in his own country; like any established community, gatekeepers don't like people who go against the mainstream doxa, and this book received little interest even though it is a major contribution on a very urgent issue in academic research. Few are those who have the courage and honesty to say it loudly, like the excellent positions taken by Barnett, Kuntz, or Cumming! But history will give credit to those who stood with a backbone for the truth.

I follow many groups in various disciplines, and this issue will be the TRUE CHALLENGE of the coming years if we want to restore the credibility of scientific research!

YB


5 October 2018

#Hoax: three scholars wrote 20 fake papers using fashionable jargon and what it reveals about academia! #Publish&Perish

James A. Lindsay, Helen Pluckrose, and Peter Boghossian, the scholars behind the hoax

Over the past 12 months, three scholars—James Lindsay, Helen Pluckrose, and Peter Boghossian—wrote 20 fake papers using fashionable jargon to argue for ridiculous conclusions, and tried to get them placed in high-profile journals in fields including gender studies, queer studies, and fat studies. Their success rate was remarkable: By the time they took their experiment public late on Tuesday, seven of their articles had been accepted for publication by ostensibly serious peer-reviewed journals. Seven more were still going through various stages of the review process. Only six had been rejected.

We’ve been here before.

In the late 1990s, Alan Sokal, a professor of physics … read more here

19 September 2018

2018-2020 Research Priorities

Download here

Marketing Science Institute (2018), “Research Priorities 2018-2020” Cambridge, Mass.: Marketing Science Institute.

7 September 2018

Why is academic writing in knitr & R Markdown the new trend? #Reproductible_Research #MarketingThema

In these videos, the authors explain the advantages of academic writing in plain text using Markdown syntax. They also give a live demonstration of how writing in Markdown works.
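As a minimal sketch (the file content below is invented for illustration, not taken from the videos), an R Markdown document mixes Markdown prose with executable knitr chunks, so every number and figure in the report is regenerated from code each time the document is knitted:

````markdown
---
title: "A minimal reproducible report"
output: html_document
---

Prose is written in plain Markdown; results are computed, not pasted in.

```{r sepal-summary}
# This chunk runs when the document is knitted
summary(iris$Sepal.Length)
```

```{r sepal-plot, echo=FALSE}
hist(iris$Sepal.Length, main = "Sepal length", xlab = "cm")
```
````

Saved as, say, report.Rmd, it renders to HTML with a single call such as `rmarkdown::render("report.Rmd")`.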


24 August 2018

In praise of Evidence Based Research: how even top-tier university researchers get mistakes published and reviewers fail to spot them

Professor Andrew Gelman presented at the 7th ESRC Research Methods Festival, 5-7 July 2016, University of Bath

Using previous research is important, but do we ever acknowledge that many studies, even those published in top journals by top-tier researchers, are never replicated, contain errors of analysis not spotted by reviewers, and may contain false positives or false negatives (Type I and Type II errors)? We seem to forget all of that.

In this presentation, Prof. Gelman gives examples confirming some of the misuses and bad analyses that get published. The culture of publish or perish pushes toward an abnormal rate of scientific production; quality control and the time to weigh and ponder research are no longer available. In a recent article, some researchers defend the idea that we must stop producing any new research before doing what they call « evidence based research »: making sure we have already searched ALL the previous research, so that we stop wasting financial resources, human resources, and time conducting the same research again or ignoring systematic reviews. They published a joint statement calling for more efficient research that uses the available resources wisely.

Read the statement:  

Lund, H., Brunnhuber, K., Juhl, C., Robinson, K., Leenaars, M., Dorch, B. F., … & Chalmers, I. (2016). Towards evidence based research.

21 August 2018

Have you heard about p-hacking before? Then listen to this video on why most published research is wrong.
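As a quick illustration of the mechanism (a simulation sketch, not material from the video): when there is no true effect anywhere but an analyst tests many outcomes per study and reports only the best one, the nominal 5% false-positive rate is inflated far beyond 5%.

```r
# Sketch: "p-hacking" by testing 10 independent outcomes per study
# (no true effect exists) and declaring success if any p value < .05.
set.seed(42)
n_studies   <- 2000
n_outcomes  <- 10
n_per_group <- 30

false_positive <- replicate(n_studies, {
  p_values <- replicate(n_outcomes, {
    t.test(rnorm(n_per_group), rnorm(n_per_group))$p.value
  })
  min(p_values) < 0.05   # "significant" if any outcome clears the threshold
})

mean(false_positive)     # roughly 1 - 0.95^10 ≈ 0.40, not the nominal 0.05
```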

14 August 2018

Economics Is Not Rocket Science – It’s Even More Complicated || #MarketingThema

Authored by Gary Galles via The Mises Institute,

Over the years, I have heard multiple different things described as “not rocket science.” The implication was always that rocket science was just about the hardest thing to do, making virtually everything else easy by comparison. As an economics professor over many of those years, I have increasingly come to object to that characterization. I think the questions of social coordination that economics addresses may not require “rocket science,” but are in many ways much more complex and difficult, especially when it comes to imposing control.

After all, we have successfully sent rockets to many places in our planetary neighborhood, demonstrating a tolerable ability to solve enough of the relevant problems, yet economic policies remain known for causing more harm than help.

As Peter Boettke once led off a post, “Political economy ain’t rocket science. But it is a discipline that forces one to focus on ideas and the implementation of ideas in public policies.” And the more one tries to control, the more those ideas and implementation issues stack up against the possibility, much less the probability, of effectiveness.

In one important sense, rocket science is simply vector addition of the relevant forces. And the relevant relationships for generating rockets’ thrust are governed by physical laws and relationships that are stable and mathematically expressible. That is why one website deviated from rocket science orthodoxy, under the title “Rocket science is easy; rocket engineering is hard.” The problem is one of accurately measuring the needed information and controlling the relevant forces—that is, engineering things (often with millions of parts) so that they work as intended.

Read more here from the source 

7 July 2018

A great new resource for those using R! An extension of the `ggplot2` package for creating graphics with details from statistical tests included

`ggstatsplot` is an extension of the `ggplot2` package for creating graphics with details from statistical tests included in the plots themselves. It is targeted primarily at the behavioral sciences community and provides one-line code to produce information-rich plots, which can be used for quick data exploration or for publications/reports/notebooks/etc.:
https://cran.r-project.org/…/packages/ggstatsplot/index.html

Currently, it supports only the most common types of statistical tests (parametric, nonparametric, and robust versions of t-tests/ANOVA, correlation, and contingency table analyses):

– violin plots (for comparisons *between* groups or conditions),
– pie charts (for categorical data),

read more »
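A minimal usage sketch (R's built-in iris data is used here purely for illustration): a single call draws the plot and annotates it with the test results.

```r
# Sketch: one-line between-groups comparison with the statistical test
# reported directly on the plot.
# install.packages("ggstatsplot")   # if the package is not installed
library(ggstatsplot)

ggbetweenstats(
  data = iris,
  x    = Species,       # grouping variable
  y    = Sepal.Length   # outcome variable
)
```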

5 July 2018

What is an epistemicide? Decolonization of knowledge

A simple example, to start with! Every time, in your bibliography or scientific work, you cite authors from the North 90% of the time, even though similar works are conducted in the South, you are committing an epistemicide! You are making other productions invisible and marginal by granting exclusive legitimacy to only a few circles of knowledge production.

Another example is the Islamic tradition of the « halaqa » as a way of producing knowledge and as a research and academic methodology, a form and method that existed in the era of Islamic Spain and that no longer exists today in the academic world.

A definition according to Quora: « Its a systematic destruction of any indigenous knowledge base. Any knowledge which doesn’t converge with the perpetrator’s knowledge system. It doesn’t believe in fusion or exchange of knowledge but complete disregard of the other’s knowledge. »