Peer Reviewed Journals should be demoted in the age of Big Data: to avoid closed-source manipulation of data, to avoid mixing with bad data, and for security. Those who are still not aware of the NSA's spyware activities and their possible relationship with Heartbleed can read this linked article. The journals are usually published from the U.S., the main culprit behind governmental spyware activities. Demoting the Peer Reviewed Journal not only closes the chance of data theft from aware authors, mainly attached to renowned universities; it also promotes Free Software culture. Publication should essentially be based on Revision Control Software.
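To illustrate what revision-control-based publication means in practice, here is a minimal sketch (the function names and record layout are my own illustration, not any real publishing system): each revision of a manuscript stores the hash of the previous revision, so the full edit history becomes tamper-evident, which is the core idea behind systems like Git.

```python
# Minimal sketch of revision-control-based publication: every revision of a
# paper carries the SHA-256 hash of the previous revision, so any silent
# after-the-fact edit to an old revision breaks the chain and is detectable.
import hashlib
import json

def add_revision(history, author, text):
    """Append a new revision linked to the hash of the previous one."""
    parent = history[-1]["hash"] if history else None
    record = {"author": author, "text": text, "parent": parent}
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    record["hash"] = hashlib.sha256(payload).hexdigest()
    history.append(record)
    return record

def verify(history):
    """Recompute every hash; a silently edited old revision fails the check."""
    for i, rec in enumerate(history):
        expected_parent = history[i - 1]["hash"] if i else None
        body = {k: rec[k] for k in ("author", "text", "parent")}
        payload = json.dumps(body, sort_keys=True).encode("utf-8")
        if rec["parent"] != expected_parent:
            return False
        if rec["hash"] != hashlib.sha256(payload).hexdigest():
            return False
    return True

paper = []
add_revision(paper, "Author A", "Draft 1 of the manuscript")
add_revision(paper, "Reviewer B", "Draft 2 with open review comments")
print(verify(paper))           # True
paper[0]["text"] = "Quietly altered draft"
print(verify(paper))           # False
```

With a history like this published openly, any reader, not only a closed circle of "peers", can verify that nothing was quietly altered after the fact.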
Peer Reviewed Journal Should be Demoted in the Age of Big Data : Risks For The General Users
With the advent of Social Networking and mobile devices, it is known that everything from Retweets, Favs and Likes to Followers can basically be "purchased". There are tools for detecting fake followers (on Twitter) too. In the same way, there are Social Media Marketing tools for academic articles. Professional dissertation/thesis editors (like copy editors) are available, ghost thesis writers are available, and citations can be purchased. Except for patents (non-software, business-critical), master's theses and doctoral dissertations, publishing a dissertation/thesis via third-party websites has been a risk factor for various reasons:
- The journal websites usually use non-Free, poorly coded software that is vulnerable to attacks
- The related methods used to build a kind of Semantic Web rely on closed-source software and packages
- No warranty of the author's identity security is provided by the major journals
- Most journals, especially in health, are practically controlled by closed-source textbook publishers
- Data procured and processed are never attached in a cross-verifiable way
- Nearing 2015, there is no meaning in differentiating between various disciplines for real development
- "Peer" basically becomes "known persons" after a time
- Journal websites are targets of pharma spammers
- Journal websites are web spam; most have no readable content without membership
The huge amount of bad data produced by these journals basically demands huge manual work in the Big Data area to process. Rofecoxib is a pioneer, like the NSA's PRISM, in revealing data manipulation:
http://en.wikipedia.org/wiki/Rofecoxib
That nearly 140,000 people died from data manipulation is not of lesser importance than PRISM. Many orthopedic surgeons' names were falsely included in fake trials:
http://www.aaos.org/news/aaosnow/apr09/cover1.asp
Apart from the chance of data theft, lobbying, identity theft and the promotion of closed-source structures, such facts clearly show that Peer Review does not work. It is very difficult to publish even a simple blog post on how to install Nginx without being right; it is even more dangerous to publish a plugin or Free Software that does not really work.
Peer Reviewed Journal Should be Demoted in the Age of Big Data : Demotion is Not Easy
It is very difficult to demote the Peer Reviewed Journal as an authentic source of information, despite the facts: they are not quite easy or funny to read, access usually demands a fee, plus the other clauses written above. The basic reason for the difficulty in demotion is the corruption-based money earning related to these journals. A verifiable data set should look like this:
Abhishek Ghosh 30 years 300 cm admin@thecustomizewindows +919333212726
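A record in this format can be turned directly into database-ready, cross-checkable fields. A minimal sketch, assuming the field layout shown in the example line above (name, age, height, e-mail, phone):

```python
# Minimal sketch: parse the verifiable record format above into structured,
# database-ready fields. The field layout is assumed from the example line.
import re

RECORD = "Abhishek Ghosh 30 years 300 cm admin@thecustomizewindows +919333212726"

PATTERN = re.compile(
    r"^(?P<name>.+?)\s+"
    r"(?P<age>\d+)\s+years\s+"
    r"(?P<height_cm>\d+)\s+cm\s+"
    r"(?P<email>\S+)\s+"
    r"(?P<phone>\+\d+)$"
)

def parse_record(line):
    """Return the record as a dict suitable for insertion into a database."""
    match = PATTERN.match(line)
    if match is None:
        raise ValueError("Record does not follow the verifiable format")
    row = match.groupdict()
    row["age"] = int(row["age"])
    row["height_cm"] = int(row["height_cm"])
    return row

print(parse_record(RECORD)["name"])       # Abhishek Ghosh
print(parse_record(RECORD)["height_cm"])  # 300
```

Because every field identifies a real person, anyone can later cross-verify the row; this is exactly what the aggregated journal format below makes impossible.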
Practically, when proper consent is required to conduct health-related "research", revealing the identity of the participant should be a big point, so that anyone can verify it later. If the above example had no person's name, it would never be possible to really run a "Peer Review". The old treatises never actually needed a "journal". Except for their use in master's theses and doctoral dissertations, they are not quite useful. In the journals, you will get data in this format:
Among 33409 people who were 300 cm tall, 33 essentially died (p value 0.00001).
Basically, in that format, it is not possible to convert the data into anything suitable for a database. Whether PHP's random function was used to generate the numbers, keeping the p value constant, is questionable! I myself have seen such theses! The reason is: if the values are not within a normal range, others will not love them. They usually only check the digits. Without doing a bit of blackhat, it is not possible to get the degrees within a practical time in life! Google has Disavow tools, but to really correct 100-year-old data, you would have to find DNA from the graveyard, or at worst the cremation ground. In that way, a claim like "Humans do not grow significantly taller after 18 years" practically never gets corrected. Now, if someone conducts research, it becomes "Changing Trends of Growth Pattern in the Last 2 Decades". All of us happily sign to pass such students' papers; we never worry that it becomes difficult to write a textbook. Again, this is a problem introduced by the colonial method of education. In the post-colonial age, a case presentation is a successful blog post.
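The core problem with the aggregated format can be shown in a few lines: an enormous number of different raw data sets, including entirely fabricated ones, reproduce the same published summary, so the summary alone can never be cross-verified record by record. A small sketch (the fabrication routine is my own illustration):

```python
# Sketch: the journal-style summary "among 33409 people, 33 died" is
# compatible with an enormous number of different raw data sets, so the
# published summary alone can never be cross-verified record by record.
import random

def fabricate_records(total=33409, deaths=33, seed=None):
    """Generate one of the many per-person outcome lists (True = died)
    that reproduce the published summary exactly."""
    rng = random.Random(seed)
    outcomes = [True] * deaths + [False] * (total - deaths)
    rng.shuffle(outcomes)
    return outcomes

# Two obviously different raw data sets with the identical summary:
first_die = [True] * 33 + [False] * (33409 - 33)
last_die = [False] * (33409 - 33) + [True] * 33
print(sum(first_die), sum(last_die))  # 33 33
print(first_die == last_die)          # False
```

Only the named, per-person record format shown earlier lets a reviewer tell a real trial from a fabricated one; the aggregate sentence cannot.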
If you want to publish your dissertation in any Peer Reviewed Journal with a great number of citations, pay me $5000 and I will definitely get it published. It does not matter whether the research was ever done or not. The editors also need money, links from Wikipedia, maybe a Wikipedia page, and some other advantages.