Synchronization of Moodle with the PlagAware Library
PlagAware is an online service that offers plagiarism checks for academic texts such as Bachelor's or Master's theses. As part of the plagiarism check, websites suspected of containing sources of the text under examination may be retrieved.
The most important facts at a glance
- The PlagAware bot identifies itself by the UserAgent Mozilla/5.0 (compatible; PlagAwareBot).
- The PlagAware bot does not crawl your site; it retrieves only individual pages of your website where possible matches with a text being checked are suspected.
- The PlagAware bot adheres to the directives in the robots.txt file and can thus be controlled.
What does the PlagAware Bot do?
PlagAware is the leading German online service for checking academic texts for plagiarism. In particular, PlagAware is used to check texts produced in teaching contexts, such as seminar papers, bachelor's theses, master's theses, or dissertations, for possible plagiarism.
During the plagiarism check, individual passages of the text under examination are passed to search engines in order to identify websites and other sources suspected of containing similar content. Based on the search engine results, websites that represent possible sources are retrieved automatically and compared with the text in question. The task of the PlagAware bot is to retrieve these identified pages. This means that no complete websites are crawled; instead, individual pages are fetched in order to compare them with the text being checked.
How does the PlagAware Bot behave?
As described, the PlagAware bot does not crawl entire websites; instead, it targets individual pages that were identified as potentially relevant in a preceding web search. In this way, traffic is minimized and no additional load is placed on the site in question.
The retrieved text content is temporarily stored (cached) for 48 hours. This prevents the same page from being contacted and retrieved repeatedly, which further reduces traffic.
The PlagAware bot is uniquely identified by the UserAgent Mozilla/5.0 (compatible; PlagAwareBot). In combination with the robots.txt file, the PlagAware bot can therefore easily be instructed not to access individual pages, sections, or the entire website.
The PlagAware bot respects the robots.txt file. This file allows webmasters to specifically exclude bots from individual pages, sections, or the entire website.
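Because the bot announces itself with a fixed UserAgent string, it can also be recognized on the server side, for example when filtering access logs. The following sketch shows one way to do this in Python; the helper name and sample strings are illustrative, and only the UserAgent value itself is taken from this article:

```python
# Minimal sketch: recognize PlagAware bot requests by their User-Agent header.
# The function name is a hypothetical helper, not part of any PlagAware API.

PLAGAWARE_UA = "Mozilla/5.0 (compatible; PlagAwareBot)"

def is_plagaware_request(user_agent: str) -> bool:
    """Return True if the given User-Agent header identifies the PlagAware bot."""
    return "PlagAwareBot" in user_agent

print(is_plagaware_request(PLAGAWARE_UA))                     # True
print(is_plagaware_request("Mozilla/5.0 (Windows NT 10.0)"))  # False
```

Matching on the distinctive "PlagAwareBot" token rather than the full string keeps the check robust against minor variations in the rest of the header.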
How can I exclude the PlagAware bot?
A detailed description of how bots can be excluded from individual pages, sections of a website, or the entire website based on their UserAgent identifier can be found, for example, in the Wikipedia article on the Robots Exclusion Standard.
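As a brief sketch, a robots.txt file that excludes only the PlagAware bot might look like the following. The directory path is illustrative; only the UserAgent token is taken from this article:

```
# Block the PlagAware bot from the entire site
User-agent: PlagAwareBot
Disallow: /

# Alternatively, block it from a single directory only (example path)
# User-agent: PlagAwareBot
# Disallow: /private/

# All other bots remain unaffected
User-agent: *
Disallow:
```

The file must be placed at the root of the website (e.g. example.com/robots.txt) for bots to find it.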
Where can I obtain further information?
If you have any questions about the PlagAware Bot, suspected misuse or other issues, please feel free to contact us at any time. The easiest way to do this is via our contact form.