An Introduction To Open Source Intelligence (OSINT) Gathering


The Internet revolution has turned the world into a small village. Opening the network to billions of people worldwide to communicate and exchange digital data has shifted the entire world into an information age. Open Source Intelligence (OSINT) refers to intelligence gathered from publicly available information.

There is no specific date for when the term OSINT was first coined; however, similar terms have likely been used for hundreds of years to describe the act of gathering intelligence by exploiting publicly available resources.

In recent history, OSINT was introduced during World War II as an intelligence tool by many nations' security agencies. However, with the explosive growth of Internet communications and the huge volume of digital data produced by the public worldwide, OSINT gathering has become a necessity for many kinds of organizations; government departments, nongovernmental organizations (NGOs), and business corporations are starting to rely to a large extent on OSINT rather than private and classified information.

OSINT sources are distinguished from other forms of intelligence because they must be legally accessible by the public without breaching any copyright or privacy laws. This distinction makes the ability to gather OSINT sources applicable to more than just security services. For example, businesses can benefit from exploiting these resources to gain intelligence about their competitors.

OSINT types

OSINT includes all publicly accessible sources of information. This information can be found either online or offline:

  1. The Internet, which includes the following and more: forums, blogs, social networking sites, video-sharing sites like YouTube.com, wikis, Whois records of registered domain names, metadata and digital files, dark web resources, geolocation data, IP addresses, people search engines, and anything that can be found online.
  2. Traditional mass media (e.g., television, radio, newspapers, books, magazines).
  3. Specialized journals, academic publications, dissertations, conference proceedings, company profiles, annual reports, company news, employee profiles, and résumés.
  4. Photos and videos, including their metadata.
  5. Geospatial information (e.g., maps and commercial imagery products).

OSINT Organizations

Some specialized organizations provide OSINT services. Some of them are government based, and others are private companies that offer their services to different parties such as government agencies and business corporations on a subscription basis. The following are the most well-known OSINT organizations:

Government Organizations

  1. Open Source Center: controlled by the U.S. government.
  2. BBC Monitoring (https://monitoring.bbc.co.uk/login) is a department within the British Broadcasting Corporation (BBC) that monitors foreign media worldwide. BBC Monitoring is directed by the BBC and offers its services on a subscription basis to interested parties such as commercial organizations and UK official bodies.

Private Sector

  1. Jane’s Information Group (http://www.janes.com) is a British company founded in 1898. Jane’s is a leading provider that specializes in military, terrorism, state stability, serious and organized crime, proliferation and procurement intelligence, aerospace, and transportation subjects.
  2. The Economist Intelligence Unit (https://www.eiu.com/home.aspx) is the business intelligence, research, and analysis division of the British Economist Group.
  3. Oxford Analytica (http://www.oxan.com) is a relatively small OSINT firm compared with the previous two. Oxford Analytica specializes in geopolitics and macroeconomics subjects.

Parties Interested in OSINT Information

OSINT can be beneficial to many different actors. We will briefly list them and mention what motivates each one to search for OSINT resources.

  1. Government: Government bodies, especially military departments, are considered the largest consumer of OSINT sources. Governments need OSINT sources for different purposes such as national security, counterterrorism, cybertracking of terrorists, understanding domestic and foreign public views on different subjects, supplying policy makers with required information to influence their internal and external policy, and exploiting foreign media like TV to get instant translations of different events happening outside.
  2. International Organizations: International organizations like the UN use OSINT sources to support peacekeeping operations around the globe. Humanitarian organizations, like the International Red Cross, use OSINT sources to aid them in their relief efforts in a time of crisis or disaster. They use OSINT intelligence to protect their supply chain from terrorist groups by analyzing social media sites and Internet messaging applications to predict future terrorist actions.
  3. Law Enforcement Agencies: Police use OSINT sources to protect citizens from abuse, sexual violence, identity theft, and other crimes. This can be done by monitoring social media channels for interesting keywords and pictures to help prevent crimes before they escalate.
  4. Business Corporations: Information is power, and corporations use OSINT sources to investigate new markets, monitor competitors’ activities, plan marketing activities, and predict anything that can affect their current operations and future growth.
    Businesses also use OSINT intelligence for other nonfinancial purposes such as the following:
    A. To fight against data leakage, knowing that exposure of confidential business information and of network security vulnerabilities is a source of future cyber-threats.
    B. To create their threat intelligence strategies through analyzing OSINT sources from both outside and inside the organization and then combining this information with other information to accomplish an effective cyber-risk management policy that helps them to protect their financial interests, reputation, and customer base.
  5. Penetration Testers and Black Hat Hackers/Criminal Organizations: OSINT is used extensively by hackers and penetration testers to gather intelligence about a specific target online. It is also considered a valuable tool to assist in conducting social engineering attacks. The first phase of any penetration testing methodology begins with reconnaissance (in other words, with OSINT).
  6. Privacy-Conscious People: These are ordinary people who might want to check how outsiders can break into their computing devices and what their ISP knows about them. They also need to know their online exposure level to close any security gap and delete any private data that may have been published inadvertently. OSINT is a great tool to see how your digital identity appears to the outside world, allowing you to maintain your privacy. Individuals can also use OSINT to fight against identity theft, for example, in case someone is impersonating you.
  7. Terrorist Organizations: Terrorists use OSINT sources to plan attacks, collect information about targets before attacking them (like when using satellite images such as Google Maps to investigate the target location), procure more fighters by analyzing social media sites, acquire military information revealed accidentally by governments (like how to construct bombs), and spread their propaganda across the world using different media channels.

Information Gathering Types

OSINT sources can be collected using three main methods: passive, semi-passive, and active. Which one you use depends on the scenario in which the gathering process operates, as well as the type of data you are interested in.

Passive Collection

This is the most used type when collecting OSINT intelligence. Indeed, all OSINT intelligence methods should use passive collection because the main aim of OSINT gathering is to collect information about the target via publicly available resources only.

Semi-passive

From a technical view, this type of gathering sends limited traffic to target servers to acquire general information about them. This traffic tries to resemble typical Internet traffic to avoid drawing attention to your reconnaissance activities. You are not conducting an in-depth investigation of the target's online resources, only investigating lightly without raising any alarm on the target's side.

Active Collection

In this type, you interact directly with the system to gather intelligence about it. The target can become aware of the reconnaissance process, since the person or entity collecting information will use advanced techniques to harvest technical data about the target's IT infrastructure, such as accessing open ports, scanning for vulnerabilities (e.g., unpatched Windows systems), scanning web server applications, and more. This traffic will look like suspicious or malicious behavior and will leave traces in the target's intrusion detection system (IDS) or intrusion prevention system (IPS). Conducting social engineering attacks on the target is also considered a type of active information gathering.

Benefits of OSINT

The benefits of OSINT span many areas in today’s world. The following are the main ones:

  1. Less risky: Using publicly available information to collect intelligence carries far less risk than other forms of intelligence, such as using spy satellites or human sources on the ground, especially in hostile countries.
  2. Cost effective: Collecting OSINT is generally less expensive compared with other intelligence sources. For instance, using human resources or spying satellites to collect data is costly.
  3. Ease of accessibility: OSINT sources are always available, no matter where you are, and are always up-to-date.
  4. Legal issues: Most OSINT resources can be shared between different parties without worrying about breaching any copyright license as these resources are already published publicly.
  5. Aiding financial investigators: OSINT allows specialized government agencies to detect tax evaders, for instance. Many famous celebrities and some giant companies are involved in tax evasion, and monitoring their social media accounts, vacations, and lifestyles has a great value for a government inspector who may be chasing them for undeclared income.
  6. Fighting against online counterfeiting: OSINT techniques can be used to find false products/services and direct law enforcement to close such sites or to send warnings to users to stop dealing with them.
  7. Maintaining national security and political stability: This might be the most important role of OSINT; it helps governments to understand their people’s attitudes and to act promptly to avoid any future clashes.

Challenges of Open Source Intelligence

All intelligence gathering methodologies have some limitations, and OSINT is not exempt from this rule. The following are the main challenges that face OSINT gathering:

  1. Sheer volume of data: Collecting OSINT will produce a huge amount of data that must be analyzed to be considered of value. Of course, many automated tools exist for this purpose, and many governments and giant companies have developed their own set of artificial intelligence tools and techniques to filter acquired data. However, the tremendous volume of data will remain a challenge for the OSINT gatherer.
  2. Reliability of sources: Bear in mind that OSINT sources, especially when used in the intelligence context, need to be verified thoroughly by classified sources before they can be trusted.
  3. Human efforts: As we already mentioned, the sheer volume of data is considered the greatest challenge for OSINT collection. Humans need to view the output of automated tools to know whether the collected data is reliable and trustworthy; they also need to compare it with some classified data (this is applicable for some military and commercial information) to assure its reliability and relevance. This will effectively consume time and precious human resources.

Summary

In this article we tried to shed light on the essence of OSINT, its types and users, and how it can be used in different contexts by different parties to gain intelligence. In the next series of articles, we will delve deeper and demonstrate different techniques and tools that can be used to locate information online.

Final Note: To read more about OSINT gathering methods and tools, you can read the author's book, Open Source Intelligence Methods and Tools: A Practical Guide to Online Intelligence, published by Apress (https://www.apress.com/gp/book/9781484232125). A companion website for the book is available at http://www.OSINT.link

Main Image Credit : The awesome piece of artwork used to head this article is called ‘Research’ and it was created by graphic designer Miroslav Kostic.



Source

15 Penetration Testing Tools – Open Source

In the footprinting or reconnaissance phase, a penetration tester collects as much information as possible about the target machine. The primary purpose of this phase is to gather intelligence so that you can conduct an effective penetration test. At the end of this phase, you are expected to have a list of IP addresses for your targets that you can scan later on.


Reconnaissance /footprinting tools

Reconnaissance can be either active or passive. In active reconnaissance you send traffic to the target machine, while passive reconnaissance uses the Internet to gather information without touching the target. When you use active reconnaissance, remember that the target may notice you are planning a penetration test. In a passive test, the target has no clue about who is gathering intelligence and planning an attack. The following are the tools you can use:

  1. Google: use advanced Google search operators to gather information about the target’s website, web servers, and exposed sensitive information. Sometimes, jobs posted on a company’s website reveal valuable information about the type of information technology used in the target company.
  2. theHarvester: you can use it to catalog email addresses and subdomains. It works with all the major search engines, including Bing and Google, and comes built into Kali Linux.
  3. WHOIS: to get information about domains, IP addresses, and DNS, you can run the whois command from your Linux machine. Just type whois followed by the domain name:

whois yourdomain.com

Alternatively, you can visit whois.net and type the domain name of your target.
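Under the hood, WHOIS is a very simple protocol: the client opens a TCP connection to port 43 on a WHOIS server, sends the domain name followed by CRLF, and reads the reply until the server closes the connection. As a rough illustration (the `query_whois` helper and the default server choice are ours, not part of any tool above), the same lookup can be scripted in Python:

```python
import socket

def query_whois(domain, server="whois.iana.org", port=43, timeout=10):
    """Send a WHOIS query over TCP port 43 and return the raw text reply."""
    with socket.create_connection((server, port), timeout=timeout) as sock:
        # The entire request is just the query string plus CRLF.
        sock.sendall(domain.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # server closes the connection when done
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

# Example: print(query_whois("example.com"))
```

For real lookups, the command-line client typically starts at a root server such as whois.iana.org to find the authoritative registry, then queries that; this sketch performs only a single query.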

  4. Netcraft: Netcraft has a free online tool to gather information about web servers, including both client- and server-side technologies. Visit http://toolbar.netcraft.com/site_report/ and type the domain name.
  5. Nslookup: you can use it to query DNS servers in order to extract valuable information about the host machine. The tool is available on both Linux and Windows. From your Windows machine, open the command prompt and type nslookup followed by the domain name.
  6. Dig: another useful DNS lookup tool, available on Linux machines. Type dig followed by the domain name.
  7. MetaGoofil: a metadata collection tool. Metadata means data about data. For instance, when you create a document in Microsoft Word, additional information is attached to the file, such as its size, creation date, and the username of the creator; all of this additional information is called metadata. MetaGoofil scours the Internet for metadata about your target. You can use it on both Linux (built into Kali Linux) and Windows.
  8. ThreatAgent Drone: a web-based tool. You need to sign up at https://www.threatagent.com/ and type the domain name that you want to investigate. Once the drone extracts all the information about your target, it will create a complete report, including the IP address range, email addresses, points of contact, etc.
  9. Social engineering: perhaps the easiest way to gather information about an organization. You can find lots of free information about social engineering on the Internet. Depending on the type of information you need about your target organization, you choose the appropriate technique. But remember that social engineering takes time to master, and you need to plan it very carefully; otherwise, your activity can easily trigger an alert.
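To make the metadata idea behind MetaGoofil concrete: modern Office documents (.docx, .xlsx, .pptx) are ZIP archives, and the file docProps/core.xml inside them records the author, title, and timestamps. A minimal sketch using only Python's standard library (the `office_metadata` helper is our own, for illustration):

```python
import zipfile
import xml.etree.ElementTree as ET

# XML namespaces used inside docProps/core.xml of Office Open XML files
NS = {
    "cp": "http://schemas.openxmlformats.org/package/2006/metadata/core-properties",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def office_metadata(path_or_file):
    """Extract creator/title metadata from an Office Open XML document."""
    with zipfile.ZipFile(path_or_file) as zf:
        root = ET.fromstring(zf.read("docProps/core.xml"))
    meta = {}
    for tag in ("dc:creator", "dc:title"):
        el = root.find(tag, NS)
        if el is not None and el.text:
            meta[tag.split(":")[1]] = el.text
    return meta
```

Tools like MetaGoofil apply the same principle at scale: they download publicly posted documents for a domain and harvest usernames, paths, and software versions from fields like these.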

After gathering solid information about the target, the next step is to start scanning the target system.


Scanning Tools

A pen tester scans the target machine in order to find weaknesses in its systems. The two major activities of the scanning phase are port scanning and vulnerability scanning.

Port scanning identifies the list of open ports on the target, and from that list you can determine what types of services are running on the system.

The second step in scanning is to run a vulnerability scan to identify specific weaknesses in the software and services running on the servers.

At the end of the scanning phase you will have the following information:

  • Number and type of open ports
  • Type of services running on the servers
  • Vulnerabilities of the services and software
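The port-scanning half of this phase boils down to attempting connections and noting which ones succeed. Here is a minimal TCP connect scan in Python, far slower and noisier than Nmap but illustrative of the idea (the `scan_ports` helper is our own sketch, not a real tool):

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                open_ports.append(port)  # connection succeeded: port is open
        except OSError:
            pass  # refused, filtered, or unreachable
    return open_ports
```

For example, scan_ports("127.0.0.1", range(1, 1025)) checks the well-known ports on your own machine. Real scanners add parallelism, SYN scanning, and service fingerprinting on top of this loop.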

10. Nmap

If you have any doubt about which tool to use for scanning, use Nmap. This tool creates a complete list of open ports on your target. You can use it in both Windows and Linux environments. The graphical interface for Windows is called Zenmap, which you can run without learning any commands. But for greater control and granularity in the output, you need to learn the commands.

11. Nessus

Once you have the list of open ports, the next step is to start looking for vulnerabilities in the servers. One of the most efficient vulnerability scanning tools is Nessus. Remember that Nessus is not a free tool.

12. Nexpose: if you are looking for a free vulnerability scanner, you can use Nexpose Community Edition from Rapid7.


Penetration testing/exploitation

This is the most important phase of a penetration test. It is also known as exploitation because at this stage the pen tester makes real attempts to gain access to the target system.

13. Medusa: you can use it to gain access to authentication services on the target machine. Medusa can authenticate against a number of popular services such as FTP, HTTP, IMAP, MS SQL, MySQL, PCAnywhere, POP3, RLOGIN, SMTP, Telnet, SSH, and VNC. Before using Medusa you need several pieces of information at hand, such as a username, the target IP address, and a password file (a dictionary file containing a list of popular and widely used passwords).

14. Hydra: another useful tool, like Medusa, used to break authentication systems.

15. Metasploit: it can be considered one of the finest open source exploitation frameworks in the world. The best thing about Metasploit is that it is free. If you are planning to become a pen tester and want to learn exploitation, you can start using Metasploit without any hesitation. But remember that exploitation tools are not vulnerability scanners. A vulnerability scanner reports the weaknesses in a system without causing any damage; in that sense, a vulnerability scanner is a passive tool. An exploitation tool like Metasploit, on the other hand, performs real exploits. When it finds a vulnerability, it exploits it, which may cause severe damage to the system or disrupt the network. So take extra care when playing with any such tools.
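At their core, password attack tools like Medusa and Hydra run a dictionary attack: try each candidate password from a wordlist against the target's login service until one is accepted. A toy sketch of that loop, run against a stand-in login function rather than a real network service (all names here are illustrative):

```python
def dictionary_attack(try_login, username, wordlist):
    """Return the first password in `wordlist` accepted by `try_login`, else None.

    `try_login(username, password)` stands in for a single authentication
    attempt against a real service (FTP, SSH, HTTP, ...).
    """
    for password in wordlist:
        if try_login(username, password):
            return password
    return None

# Toy target: in reality this would be a network login attempt.
def fake_service(username, password):
    return (username, password) == ("admin", "letmein")
```

Medusa and Hydra wrap this same loop with protocol modules, parallel connections, and rate control; the concept is no more complicated than shown here.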

Words of Caution: never use the above-mentioned tools on a network or system without authorization from the proper authority. The intention of this post is to help IT professionals who want to learn and develop a career in penetration testing.


Source

Nmap: the Network Mapper – Free Security Scanner

Nmap (“Network Mapper”) is a free and open source (license) utility for network discovery and security auditing. Many systems and network administrators also find it useful for tasks such as network inventory, managing service upgrade schedules, and monitoring host or service uptime. Nmap uses raw IP packets in novel ways to determine what hosts are available on the network, what services (application name and version) those hosts are offering, what operating systems (and OS versions) they are running, what type of packet filters/firewalls are in use, and dozens of other characteristics. It was designed to rapidly scan large networks, but works fine against single hosts. Nmap runs on all major computer operating systems, and official binary packages are available for Linux, Windows, and Mac OS X. In addition to the classic command-line Nmap executable, the Nmap suite includes an advanced GUI and results viewer (Zenmap), a flexible data transfer, redirection, and debugging tool (Ncat), a utility for comparing scan results (Ndiff), and a packet generation and response analysis tool (Nping).

Nmap was named “Security Product of the Year” by Linux Journal, Info World, LinuxQuestions.Org, and Codetalker Digest. It was even featured in twelve movies, including The Matrix Reloaded, Die Hard 4, Girl With the Dragon Tattoo, and The Bourne Ultimatum.

Nmap is …


  • Flexible: Supports dozens of advanced techniques for mapping out networks filled with IP filters, firewalls, routers, and other obstacles. This includes many port scanning mechanisms (both TCP & UDP), OS detection, version detection, ping sweeps, and more. See the documentation page.
  • Powerful: Nmap has been used to scan huge networks of literally hundreds of thousands of machines.
  • Portable: Most operating systems are supported, including Linux, Microsoft Windows, FreeBSD, OpenBSD, Solaris, IRIX, Mac OS X, HP-UX, NetBSD, Sun OS, Amiga, and more.
  • Easy: While Nmap offers a rich set of advanced features for power users, you can start out as simply as “nmap -v -A targethost“. Both traditional command line and graphical (GUI) versions are available to suit your preference. Binaries are available for those who do not wish to compile Nmap from source.
  • Free: The primary goals of the Nmap Project are to help make the Internet a little more secure and to provide administrators/auditors/hackers with an advanced tool for exploring their networks. Nmap is available for free download, and also comes with full source code that you may modify and redistribute under the terms of the license.
  • Well Documented: Significant effort has been put into comprehensive and up-to-date man pages, whitepapers, tutorials, and even a whole book! Find them in multiple languages here.
  • Supported: While Nmap comes with no warranty, it is well supported by a vibrant community of developers and users. Most of this interaction occurs on the Nmap mailing lists. Most bug reports and questions should be sent to the nmap-dev list, but only after you read the guidelines. We recommend that all users subscribe to the low-traffic nmap-hackers announcement list. You can also find Nmap on Facebook and Twitter. For real-time chat, join the #nmap channel on Freenode or EFNet.
  • Acclaimed: Nmap has won numerous awards, including “Information Security Product of the Year” by Linux Journal, Info World and Codetalker Digest. It has been featured in hundreds of magazine articles, several movies, dozens of books, and one comic book series. Visit the press page for further details.
  • Popular: Thousands of people download Nmap every day, and it is included with many operating systems (Redhat Linux, Debian Linux, Gentoo, FreeBSD, OpenBSD, etc). It is among the top ten (out of 30,000) programs at the Freshmeat.Net repository. This is important because it lends Nmap its vibrant development and user support communities.

Nmap users are encouraged to subscribe to the Nmap-hackers mailing list. It is a low-volume (6 posts in 2017), moderated list for the most important announcements about Nmap, Insecure.org, and related projects, with more than 128,000 current subscribers.

We also have a development list for more hardcore members (especially programmers) who are interested in helping the project by helping with coding, testing, feature ideas, etc. New (test/beta) versions of Nmap are sometimes released here prior to general availability for QA purposes. You can subscribe at the Nmap-dev list info page.

Both lists are archived (along with many other security lists) at Seclists.org.

Though it isn’t nearly as active as the mailing lists, the official IRC channel is #nmap on Freenode (irc.freenode.net).

Source

RethinkDB: the open-source database for the realtime web



  • Reactive web and mobile apps

    Web apps like Google Docs, Trello, and Quora pioneered the realtime experience on the web. With RethinkDB, you can build amazing realtime apps with dramatically less engineering effort.


  • Multiplayer games

    When a player takes an action in a multiplayer game, every other player in the game needs to see the change in realtime. RethinkDB dramatically simplifies the data infrastructure for low latency, high throughput realtime interactions.


  • Realtime marketplaces

    RethinkDB dramatically reduces the complexity of building realtime trading and optimization engines. Publish realtime updates to thousands of clients, and provide pricing updates to users in milliseconds.


  • Streaming analytics

    Build realtime dashboards with RethinkDB data push notifications, and make instantaneous business decisions.


  • Connected devices

    RethinkDB dramatically simplifies modern IoT infrastructures. Stream data between connected devices, enable messaging and signaling, and trigger actions in millions of devices in milliseconds.

Work with your favorite stack

Query JSON documents with Python, Ruby, Node.js or dozens of other languages. Build modern apps using your favorite web framework, paired with realtime technologies like Socket.io or SignalR.

Robust architecture

RethinkDB integrates the latest advances in database technology. It has a modern distributed architecture, a highly-optimized buffer cache, and a state-of-the-art storage engine. All of these components work together to create a robust, scalable, high-performance database.

Everything you need to build modern apps

Express relationships using joins, build location-aware apps, or store multimedia and time-series data. Do analytics with aggregation and map/reduce, and speed up your apps using flexible indexing.

Built with love by the open source community

Originally developed by a core team of database experts and over 100 contributors from around the world, RethinkDB is shaped by developers like you participating in an open source community development process.

When you’re ready to scale your app, shard and replicate in a few clicks using an intuitive web UI.

If you need it, a simple API provides precise control over the cluster:

r.table('games').reconfigure(shards=5, replicas=3)

Monitor your production cluster with live statistics and complete visibility into running jobs:

r.db('rethinkdb').table('jobs').changes()


  • RethinkDB has the best query language of all new databases I’ve seen. Guillermo Rauch, Cloudup CTO, creator of Mongoose

  • RethinkDB is probably the most interesting new ‘on-disk + complex queries’ database out there. For sure made by people who get it. Salvatore Sanfilippo, creator of Redis

  • At NASA, RethinkDB is radically simplifying how we provide real-time services in support of Extra-Vehicular Activity. Collin Estes, Director of Software Engineering, Chief Architect – MRI Technologies Inc. – NASA / ESOC



Source

PoemHunter.com: Poems – Quotes – Poetry


It was many and many a year ago,
In a kingdom by the sea,
That a maiden there lived whom you may know
By the name of ANNABEL LEE;
And this maiden she lived with no other thought
Than to love and be loved by me.

I was a child and she was a child,
In this kingdom by the sea;
But we loved with a love that was more than love-
I and my Annabel Lee;
With a love that the winged seraphs of heaven
Coveted her and me.

And this was the reason that, long ago,
In this kingdom by the sea,
A wind blew out of a cloud, chilling
My beautiful Annabel Lee;
So that her highborn kinsman came
And bore her away from me,
To shut her up in a sepulchre
In this kingdom by the sea.

The angels, not half so happy in heaven,
Went envying her and me-
Yes! – that was the reason (as all men know,
In this kingdom by the sea)
That the wind came out of the cloud by night,
Chilling and killing my Annabel Lee.

Source

8 Best Private Search Engines in 2021 – True No-Log Services

It’s a deal we all hate, but often feel like we can’t escape: big search engines like Google, Bing, and Yahoo help us find our way around the web, while we let them grab as much information about us as they can.

These search engines are data collection factories with no regard for your privacy. They log your IP address, your search terms, which results you click on, how many times you bounce back to the results page or modify the search, and much, much more.

All this data allows them to create a “user map,” a summary of your browsing personality. They sell these maps for billions of dollars a year to advertisers, who then bombard you with targeted ads.

So how can you find what you need online without a company turning you into a product they sell? Using smaller search engines that don’t log your activity is a good start.

We’ve researched the best search engines that protect your privacy, and while they may not be household names, they can certainly do the job if you know how to use them.

In a Hurry? Check Out our Quick Guide for the Best Private Search Engines

  1. Startpage: Uses Google technology without the tracking, to provide an anonymous and smooth browsing experience.
  2. DuckDuckGo: Most well known private search tool, offers a 100% transparent privacy policy.
  3. Swisscows: Uses privately owned servers, and collects search results from multiple engines.
  4. Searx.me: Meta search engine that runs on open source software and encourages you to modify their code for ultimate security.
  5. Disconnect Search: Results are displayed in the style of the search engine they were taken from.

Learn more about all of our top private search engines!

If online privacy matters to you, we strongly recommend getting a top-quality VPN before trying out any of these private search engines. It’s the simplest way to make sure you’re fully protected as you search.

Protecting Your Privacy Online with a Virtual Private Network (VPN)

Private search engines don’t keep, sell, or otherwise use your information. Unfortunately, that won’t stop the sites you visit from doing it. In many countries, your internet service provider (ISP) tracks your activity and sells the data they gather. That means you’re being watched before you even enter a search term.

As most private search engines point out in their terms of service, once you click on a search result, they can’t protect you anymore. Sites you visit can record your IP and track your use. Somewhere, most sites have a privacy policy, but by the time you find it, your data may already be compromised.

A VPN is truly your best line of defense against corporations, hackers, scammers, and government agencies tracking your device location or using your browsing data. By concealing your true location and replacing it with an IP from its own server array, a VPN prevents third parties from tracking your activity.

The best VPNs protect your data with military-grade encryption and advanced security measures that give you complete anonymity online. Features like IP cloaking, DNS/IPv6 leak protection, ad blockers, and kill switches are important for protecting your anonymity, location, and online data.

Most premium VPN services deliver on all of these features. Plus, they offer unlimited data, speed, and bandwidth, use military-grade encryption, and most can unblock all the popular streaming services (including Netflix).

Best VPNs to Use with a Private Search Engine

  1. NordVPN
    NordVPN keeps you anonymous with a strict no-logging policy and hides your data with a combination of 256-bit and 2048-bit encryption. Its industry-leading security features include Double VPN, Onion over VPN, and CyberSec, a unique ad and malware blocker.
  2. ExpressVPN
    ExpressVPN markets itself as the fastest premium VPN service available today, but it doesn’t cut corners on security to deliver those impressive speeds. Military-grade encryption, a choice of protocols, and an automatic kill switch work together to keep you hidden from data harvesters.
  3. CyberGhost
    With one of the most user-friendly interfaces in the VPN market, CyberGhost gives you total anonymity online with a single click. An automatic kill switch, AES-256 encryption, and DNS leak protection are built into apps for every major platform, so you’ll be safe on any device you use.


Try NordVPN Today!

The Best Private Search Engines

1. Startpage

Startpage landing page

Startpage is the first and arguably the world’s most private search engine currently available. It offers robust search results, customizable settings, and leading consumer privacy protection features.

The experience is very straightforward, and its search results are consistently more relevant than those of other private search engines. This is most likely because it leverages Google search technology without any tracking, while other private search engines largely depend on Yahoo’s and Bing’s search feeds.

Startpage does exceptionally well in the area of user privacy. It does not track or log any user data, nor does it share information with third parties. Its headquarters and main servers are based in the Netherlands, meaning all users worldwide are protected by Dutch and EU privacy laws, which are among the most stringent in the world. Additionally, Startpage has passed independent audits of its privacy and data-handling practices by the EuroPriSe organization.

To make searching simple, Startpage can be set as your default search engine in leading browsers or installed as a browser extension. It offers multiple ways to personalize your settings so you can search the way you want. It also has a very responsive and helpful international support team that patiently assists users with everything from simple to complex issues.

Finally, Startpage offers a very useful private browsing feature called “Anonymous View.” With this proxy feature, users can view images, videos, news, and entire websites without being tracked or leaving a trace. Every time you search on Startpage, an “Anonymous View” link appears next to each result; clicking it hides your IP address and user agent from the sites you visit.


Visit Startpage

2. DuckDuckGo

DuckDuckGo landing page

By far the best-known service that markets itself as a private search engine, DuckDuckGo is a powerful metasearch tool that gathers results from over 400 sources, including Yahoo, Bing, and Wikipedia. It is extremely popular, receiving about 14 million search queries a day.

We don’t doubt DuckDuckGo’s value as a search tool, and we appreciate that the website includes a detailed, transparent privacy policy. However, a couple of disclosures in that policy might concern you if 100% anonymity is your goal.

First off, DuckDuckGo saves search histories, although it claims that such saving is “non-personal” and aggregate, so that your searches can’t be tied directly to you. Secondly, in addition to ad revenues, DuckDuckGo makes money from commissions paid by affiliate e-commerce sites like Amazon and eBay.

That is by no means the same as selling your data, but if you prefer a private search engine that operates completely independently from companies known to be aggressive data collectors, you might want to go with one of the other options below.

Regardless, before using DuckDuckGo, make sure to thoroughly review the company’s privacy policy to see what information from your searches will be saved.

Visit DuckDuckGo

3. Swisscows 

Swisscows landing page

Swisscows’ privacy policy is simple and to the point: “We do not collect any of our visitors’ personal information. None whatsoever.”

It does not record your IP address, browser information, or device information. Your search terms are not recorded or analyzed, either. The only data that Swisscows records is the total number of search requests it receives each day.

The search engine uses its own privately owned servers and does not rely on any third-party infrastructure. Its data center is located underground in the Swiss Alps and is protected by Switzerland’s strong privacy and data-retention laws. None of its infrastructure is located inside the EU or the US.

So, how does Swisscows make money? It displays sponsors’ banner ads with your search results. These ads are based on your search query, and not your location or search history. They are non-invasive and, in line with Swisscows’ family-friendly status, never contain explicit or sexist content.

Visit Swisscows

4. Searx.me

Searx.me landing page

Technically, Searx.me is a metasearch engine, which means that it gathers results from popular search engines and combines them. Like Startpage, Searx removes any identifying data from your request so that Google and other sites receive the search phrase as an anonymous request.

Searx’s terms of service state that the search engine “doesn’t care what you search for” – in other words, no record of the search or your data will be kept. Unlike Startpage, however, Searx does not include ads on search results pages.

Instead, all results pages, along with several other pages on the Searx.me website, include a link to make a donation in support of the service. The main donations page even summarizes donations received so far (anonymously, of course).

Searx runs on open-source software, and its code is available on GitHub. In fact, the company encourages you to download the code and modify it as you see fit for maximum privacy protection. For the less tech-savvy, customizable Advanced Settings are available on the main search page.

Be aware, however, that if you download the code and run it from your own device, the search results will not be grouped with others and so could be linked directly to your IP. That would of course defeat the whole purpose of using a private search engine.

Searx is also available as a Firefox extension, so you can use it without navigating to the Searx.me homepage.

Visit Searx.me

5. Disconnect Search

Disconnect Search landing page

At least for now, Disconnect Search is an exception to the rule that a company offering you internet services for free must be making money off you in other ways. The company plans to build out the service to include paid options, but the search engine doesn’t currently generate revenue.

Like Startpage and Searx.me, Disconnect Search gets its results from other search engines. The search page allows you to choose the specific search engine it will use – the options are Bing, Yahoo, and DuckDuckGo (see above).

Also like the above providers, Disconnect Search submits your query anonymously so that it cannot be tracked to you, and keeps no record of your searches.

One of the nicer features of this engine is that it displays results in the style of the search engine they come from. This means that results pages are more visually pleasing than the ones you get with the above services.

If you want local results, you can install the Disconnect Search browser extension and search for results by location.

Disconnect also offers a VPN and a private browser. However, due to its many shortcomings, Disconnect’s VPN is not one that we recommend.

Visit Disconnect Search

6. MetaGer

MetaGer landing page

Created by a German non-profit, MetaGer is very popular in its home country and is now attracting users around the world. Similar to the above providers, MetaGer converts your search request into an anonymous query that it transmits to major search engines.

Unlike those services, however, this private search engine integrates with a proxy server that hides your IP address, allowing it to protect your privacy in ways other search engines can’t.

MetaGer’s results pages include an “open anonymously” option for every link. If you select this option, your transmission will be sent through an anonymous proxy so that the receiving website and any third parties can’t track you.

That privacy protection continues if you follow links on the destination website. It’s still not the same level of security as you would get from a top VPN, but it is a huge step up from handing over all your data to Google day after day.

Like Disconnect Search, MetaGer is supported by user contributions; you will see a prominent Donate button on your results pages. The code behind the site is open source and available for review by anyone.

Visit MetaGer

7. Qwant

Qwant landing page

We’ve ranked Qwant here because although it doesn’t relentlessly gather information like Google, it does collect some basic data. The company pledges that it “does not track people, and that’s never going to change,” but its fulfillment of that pledge is a little complicated.

The search engine is free to use without registering. If you do register, you have the option of saving searches or creating favorites. Registration requires providing your name and email address, which Qwant uses to create logs and then provide you with “personalized results.”

All of that starts to sound a little too close to the practices of traditional search engines for comfort. However, Qwant claims that it only saves search queries, that no personal information is ever shared with third parties, and that it never engages in behavioral targeting.

Perhaps that’s all true, but Qwant also acknowledges that its revenue source is advertising, and that ads are generated in cooperation with the Microsoft Bing ad network. An association with Microsoft doesn’t exactly scream “user privacy first” to us.

We are glad, however, that the provider filters out native advertising from results.

With the protection of a VPN, you can try Qwant for yourself and see how well it upholds its promises. If you have children, you might also like Qwant Junior – a search engine geared towards children that blocks unsafe results (such as those containing pornography).

Visit Qwant

8. Yase

Yase landing page

Yase takes its no-logs promise seriously. The company’s privacy policy states that it doesn’t “share any personally identifying information publicly or with third-parties, except when required to by law”, and that it doesn’t store any user logs or activity. It also specifically states that it does not store IP addresses.

The site uses Bing to answer queries. It also collects information from other sites like Wikipedia. This means you get fast, accurate answers while retaining your privacy – a win-win situation. It offers separate web, images, videos, and news results, and allows users to set filters including Safe Search and file type.

Yase says you should use its service because it offers no filter bubble, full privacy, and better results. We also think you should use it because of its clean, intuitive interface, user-friendly privacy policy, and smart answer system.

Visit Yase

The Bottom Line

Searching the web shouldn’t mean handing your private information over to third parties. Whether the result is just the annoyance of targeted ads or something far more serious like Cambridge Analytica’s tampering with US elections, surrendering your data is too high a price to pay for a “free” search service.

Private search engines help you keep your personal details hidden while you search, and deliver great results by anonymously querying the big sites like Google and Bing. Unfortunately, even when you use the best services listed above, you are still vulnerable to tracking of your activity.

Only a VPN’s end-to-end encryption and location masking can give you the freedom to browse completely anonymously. If you’re new to VPNs, we’ve got a comprehensive guide that explains exactly how the best services keep both you and your devices safe every time you surf the web.

Since our top-rated VPNs offer free trials and money-back guarantees, there is no reason not to find out what a VPN can do for you today. Any of these services will put an end to your worries about who is grabbing your data and what they’re doing with it.

To find out how you can get a premium VPN for a small price, don’t miss out on our regularly updated deals and coupon codes page.


Source

Elasticsearch Alerts: 1 day hack to create “watcher” like alerting open-source tool in Node.js


Stefan Thies

The ELK stack (Elasticsearch, Logstash, Kibana) is a great set of tools for collecting and analyzing data from various sources. Making use of this data often requires alerting mechanisms that notify users about critical business or operational issues discovered by Elasticsearch queries. Unfortunately, the open-source ELK stack does not include any tool to schedule queries or notify users. Elastic, the company behind ELK, offers commercial software extensions called X-Pack, including X-Pack Alerting (formerly called “Watcher”). The pricing is not listed on the web page, but after reading this Reddit thread it was clear to me that the pricing is over the top for many projects, and open-source alternatives are rare (e.g., ElastAlert by Yelp). So why not build another Elasticsearch alerting tool and share it on GitHub? I hope other people will contribute and help make it even better, for instance by adding more notification functions, e.g., for e-mail or Twitter.

What is X-Pack Alerting?

  1. Surprisingly for a commercial product, “Watcher” has no UI, just a large JSON configuration stored in Elasticsearch
  2. Watcher schedules Elasticsearch queries
  3. It transforms the results
  4. It sends notifications via e-mail or Slack

Sounds easy: schedule queries, do a bit of ETL and format the output for Slack notifications.

As the main contributor to the open-source tool Logagent, a kind of Logstash made with Node.js (yeah, why use Logstash, written in Ruby, when you could save 1.5 GB of main memory on each server and get the power of async I/O in Node.js?), it was clear to me that I could use the Logagent framework to implement “Watcher”-like functionality in a few simple steps, and so I did:

  1. Create an input plugin for Elasticsearch queries, similar to the command-scheduling plugin, which runs a task at a configurable interval. The Elasticsearch query language is a bit ugly with its long JSON queries, so I decided to use YAML format for better maintainability. The resulting plugin has only 85 lines of Node.js code!
  2. Use existing Logagent filter and transform functions.
  3. Implement the Slack output plugin: 75 lines of Node.js code!

Done! The final step is to create a Logagent configuration that uses the new plugins for Slack alerting based on any scheduled Elasticsearch query: 67 lines of YAML code!

Let me explain the configuration file, because this is the only thing Logagent users need to customize to create Elasticsearch alerts:

  1. Define a search query for the alert, in this case looking for HTTP error codes in HTTP logs stored in Elasticsearch, or in this specific case in Sematext Cloud (its query and indexing API is compatible with Elasticsearch). We use a one-minute interval for scheduling and a date-range query covering the last minute; Elasticsearch supports date math like “now-1m/m” in range queries. Here is my config for the new Logagent input plugin:

Input plugin configuration to schedule Elasticsearch queries
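
The configuration itself is not reproduced in this text version of the post, so here is a rough sketch of what such an input section might look like. The module name, config keys, and placeholder token are illustrative assumptions rather than Logagent's actual schema; only the date-math range query is standard Elasticsearch:

```yaml
input:
  elasticsearch-query:
    module: elasticsearch-query          # assumed plugin name
    url: https://logsene-receiver.sematext.com
    index: YOUR_LOGS_TOKEN               # placeholder for your logs token
    interval: 60                         # run the query every minute
    query:
      size: 0
      query:
        bool:
          must:
            - range:
                status: { gte: 400 }     # HTTP error codes only
            - range:
                '@timestamp':
                  gte: now-1m/m          # date math: start of the last minute
                  lt: now/m
```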

  2. Write a transform function, e.g., to rename field names or group results. Here we use Logagent’s SQL output filter to rename fields or aggregate Elasticsearch results with in-memory SQL. In this case we count errors per HTTP status code (Elasticsearch DSL aggregation queries are sometimes hard to write, and most people know SQL).

Output filter using SQL output filter plugin
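
The screenshot of this filter is likewise missing here. As an illustrative sketch (the YAML keys and source name are assumptions; the SQL is standard), counting errors per HTTP status code could look roughly like this:

```yaml
outputFilter:
  - module: sql                          # Logagent's in-memory SQL filter
    config:
      source: !!js/regexp /es-query/     # apply only to our query events (assumed name)
      interval: 60                       # aggregate one minute of results at a time
      queries:
        # "?" stands for the in-memory table of collected events;
        # plain SQL replaces the harder-to-write ES DSL aggregation
        - SELECT status AS httpStatus, COUNT(*) AS count
          FROM ? GROUP BY status
```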

  3. Get a URL for the Slack API integration at https://api.slack.com/incoming-webhooks and use it with the Slack webhook output plugin. The output section includes a JavaScript filter function that implements the alert trigger logic: full control over the trigger decision in a well-known programming language, plus the option to perform complex calculations. Sorry, Logagent scripts are not “painless”, just JavaScript 😉

Slack output plugin configuration with trigger logic as filter function in JavaScript
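
Since the configuration screenshot is not included here, a minimal sketch of such trigger logic follows. The `(context, data)` signature and the field names are assumptions; in the real setup the function lives inside the YAML output section:

```javascript
// Hypothetical alert trigger: forward the aggregated query result to the
// Slack output only when at least one HTTP error was counted.
function alertTrigger (context, data) {
  // data is the transformed query result, e.g. { httpStatus: 500, count: 3 }
  if (data && data.count > 0) {
    return data   // returning the event triggers the Slack notification
  }
  return null     // returning null drops the event, so no alert is sent
}

console.log(alertTrigger({}, { httpStatus: 500, count: 3 }))  // → { httpStatus: 500, count: 3 }
console.log(alertTrigger({}, { httpStatus: 200, count: 0 }))  // → null
```

Because the trigger is plain JavaScript, it can do anything from a simple threshold check like this to rate calculations across multiple fields.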

Note the Slack text template: it can include any field from the resulting query in the Slack message, to create readable messages for end users.

If you don’t have Logagent installed, get Node.js and then install Logagent:

> npm install -g @sematext/logagent

Run Logagent with the new configuration:

> logagent --config watchlogs.yml

And wait for the alert in Slack.

Please note that Logagent comes with setup scripts to run as a systemd, launchd (macOS), upstart, or Windows service, or in a Docker container, so the alert engine can run permanently on one of your servers. Once you get started with it, you might quickly discover more options for the new Logagent plugins, e.g., storing the alerts in Elasticsearch in parallel to Slack, re-indexing aggregated and transformed data, or simply watching log files or Unix pipes to create real-time Slack notifications without involving Elasticsearch at all.

If you have suggestions for improvements or would like to contribute to Logagent, meet us on GitHub.

Source

The World’s Largest Library Catalog

Use WorldCat tools

Keep library resources close at hand from your browser or personalized Web page. Add the world’s libraries to your mobile and FB apps.


Get citations of library materials in five common styles, and export them to a variety of formats including EndNote, Reference Manager and RefWorks.

Video tutorial: How to use WorldCat citations (YouTube)


Source

Microsoft is rebuilding its Edge browser on Chrome and bringing it to the Mac

Microsoft is announcing some significant changes to its Edge browser today. The software giant is beginning to rebuild Microsoft Edge to run on Chromium, the same open-source web rendering engine that powers Google’s Chrome browser. This means Edge will soon be powered by Blink and the V8 JavaScript engines. It’s a big move that means Microsoft is joining the open-source community in a much bigger way for the web.

“Ultimately, we want to make the web-experience better for many different audiences,” explains Joe Belfiore, corporate vice president of Windows. “People using Microsoft Edge (and potentially other browsers) will experience improved compatibility with all web sites, while getting the best-possible battery life and hardware integration on all kinds of Windows devices.”

Microsoft Edge isn’t going away, nor is the brand name. If you already use Edge on Windows, then that won’t change. All you’ll ultimately notice is that websites will render more consistently once Microsoft makes this under-the-hood change.

Microsoft’s Edge browser. Photo by Chris Welch / The Verge

Edge is coming to Windows 7, Windows 8, and the Mac

So why is Microsoft changing its rendering engine? Why now? Edge has fallen massively behind Chrome in terms of market share, and it’s getting to the point where Chrome is the new IE6. Developers are optimizing for Chrome, and Google has also been creating Chrome-only web services because it’s often the first to adopt emerging web technologies. Microsoft has struggled to keep its Edge rendering engine in stride with Chromium.

The Verge understands Microsoft has been considering this move for at least a year, and a lot of the push has been from consumers and businesses who wanted the company to improve web compatibility. Edge has been improving on this front, but even small compatibility issues have caused headaches for users along the way. A move to Chromium will immediately solve these web compatibility issues, and it aligns Edge with Chrome and other browsers that also use Blink.

Microsoft has also heard loud and clear from businesses that want the company to support a modern Edge browser across all versions of Windows. Many businesses have machines running Windows 7 and Windows 10, in a mixed environment. As a result, Microsoft is bringing Edge to Windows 7 and Windows 8, decoupling it from being exclusive to Windows 10. Edge will become a downloadable executable across all supported versions of Windows, and it means Microsoft will be able to update it far more frequently than before. It’s not clear if this will be monthly, but it will certainly not be tied to every major Windows 10 update anymore.

Another big part of overhauling Edge involves developers. A lot of web developers use a Mac to develop and test sites, but Edge doesn’t exist there, and it’s currently difficult to test Microsoft’s web rendering engine on a Mac without dual booting Windows. Microsoft is now bringing Edge to the Mac. We understand it’s not a move designed to grab more market share specifically; it’s more about making it easier for developers to test Edge. Microsoft hasn’t committed to a specific date for Edge on the Mac, but we expect to see it later next year.

Chrome on Windows will get better with Microsoft’s help

All of this work means that, ultimately, the browser engine that powers Chrome will get better on Windows. Microsoft is committing to contribute web platform enhancements that will improve both Edge and Chrome on Windows, including things like touch performance, accessibility features, and support for ARM-based versions of Windows. Microsoft has been working closely with Google engineers to help support a native version of Chrome on Windows for ARM, and this will now be available soon as a result of that work.

Microsoft is only just starting to disclose this platform shift to other companies involved in the Chromium project, and the company isn’t ready to start distributing daily builds of Edge running with Chromium just yet. Those beta builds will start early next year, before Microsoft makes the necessary changes in Windows 10 to shift Edge toward Chromium. We expect to see Windows 10 move to this Chromium-based version of Edge sometime in 2019.

Microsoft now wants to collaborate with Apple, Google, and everyone else who also commits changes to Chromium. “If you’re part of the open-source community developing browsers, we invite you to collaborate with us as we build the future of Microsoft Edge and contribute to the Chromium project,” says Belfiore. “We are excited about the opportunity to be an even-more-active part of this community and bring the best of Microsoft forward to continue to make the web better for everyone.”

Source