{"id":1783,"date":"2026-05-02T03:05:58","date_gmt":"2026-05-02T03:05:58","guid":{"rendered":"https:\/\/alanova.id\/blog\/cara-scraping-google-maps-python-selenium-api\/"},"modified":"2026-05-07T03:27:19","modified_gmt":"2026-05-07T03:27:19","slug":"cara-scraping-google-maps-python-selenium-api","status":"publish","type":"post","link":"https:\/\/alanova.id\/blog\/cara-scraping-google-maps-python-selenium-api\/","title":{"rendered":"Cara Scraping Google Maps dengan Python: Selenium, API, dan Alternatif Terbaik"},"content":{"rendered":"<p>Kalau kamu seorang developer atau sudah familiar dengan Python, scraping Google Maps menggunakan script sendiri memberikan fleksibilitas yang tidak bisa ditawarkan tools no-code manapun: kamu bisa atur sepenuhnya logika filtering, format output, dan integrasi ke sistem yang sudah ada.<\/p>\n<p>Tapi ada tantangan teknis yang perlu dipahami dulu. Kalau tujuannya hanya mengumpulkan leads bisnis untuk outreach, <a href=\"https:\/\/alanova.id\/blog\/cara-scraping-data-bisnis-google-maps-tanpa-coding-2025\/\">cara scraping Google Maps tanpa coding<\/a> hampir selalu lebih efisien untuk non-developer. Panduan ini untuk yang memang butuh pendekatan programatik.<\/p>\n<h2>Mengapa Scraping Google Maps dengan Python Lebih Kompleks dari yang Terlihat<\/h2>\n<ul>\n<li><strong>JavaScript rendering.<\/strong> Konten Maps di-render secara dinamis \u2014 requests biasa hanya mendapat HTML kosong. 
Butuh browser automation seperti Selenium atau Playwright.<\/li>\n<li><strong>Anti-bot detection.<\/strong> Google mendeteksi pola akses tidak natural dan memblokir IP atau menampilkan CAPTCHA setelah beberapa request berturut-turut.<\/li>\n<li><strong>Lazy loading.<\/strong> Hasil Maps di-load bertahap saat di-scroll \u2014 butuh logika scroll otomatis untuk dapat lebih dari 20 hasil pertama.<\/li>\n<li><strong>Struktur yang berubah.<\/strong> Google update interface Maps berkala, selector CSS di script kamu bisa break kapan saja tanpa peringatan.<\/li>\n<\/ul>\n<h2>Opsi 1: Selenium + Python (Browser Automation)<\/h2>\n<p>Selenium mengotomasi browser Chrome untuk berinteraksi dengan Maps seperti yang dilakukan manusia.<\/p>\n<h3>Install Dependencies<\/h3>\n<pre><code>pip install selenium webdriver-manager<\/code><\/pre>\n<h3>Contoh Script Dasar<\/h3>\n<pre><code>from selenium import webdriver\nfrom selenium.webdriver.common.by import By\nfrom selenium.webdriver.chrome.service import Service\nfrom webdriver_manager.chrome import ChromeDriverManager\nimport time, csv\n\ndef scrape_maps(query, max_scroll=10):\n    options = webdriver.ChromeOptions()\n    options.add_argument('--headless')\n    options.add_argument('--no-sandbox')\n    options.add_argument('--disable-dev-shm-usage')\n    options.add_argument('--user-agent=Mozilla\/5.0 (Windows NT 10.0; Win64; x64)')\n\n    driver = webdriver.Chrome(\n        service=Service(ChromeDriverManager().install()),\n        options=options\n    )\n\n    url = f\"https:\/\/www.google.com\/maps\/search\/{query.replace(' ', '+')}\"\n    driver.get(url)\n    time.sleep(3)\n\n    results = []\n    try:\n        feed = driver.find_element(By.CSS_SELECTOR, 'div[role=\"feed\"]')\n        for _ in range(max_scroll):\n            cards = driver.find_elements(By.CSS_SELECTOR, 'div.Nv2PK')\n            for card in cards:\n                try:\n                    name = card.find_element(By.CSS_SELECTOR, '.qBF1Pd').text\n                 
   if name and not any(r['name'] == name for r in results):\n                        results.append({'name': name})\n                except Exception:\n                    pass\n            driver.execute_script(\n                \"arguments[0].scrollTop = arguments[0].scrollHeight\", feed\n            )\n            time.sleep(2)\n    finally:\n        driver.quit()\n\n    return results\n\n# Example usage\ndata = scrape_maps(\"klinik gigi Surabaya\", max_scroll=15)\nprint(f\"Collected: {len(data)} listings\")<\/code><\/pre>\n<p><strong>Note:<\/strong> CSS selectors such as <code>.qBF1Pd<\/code> and <code>.Nv2PK<\/code> will change when Google updates Maps; this selector churn is the main maintenance overhead of this method. The script de-duplicates by name because each scroll pass re-reads every card already loaded.<\/p>\n<h2>Option 2: Playwright (More Modern than Selenium)<\/h2>\n<p>Playwright is a more stable alternative with better async support.<\/p>\n<pre><code>pip install playwright\nplaywright install chromium<\/code><\/pre>\n<pre><code>from playwright.sync_api import sync_playwright\n\ndef scrape_playwright(query, max_scroll=10):\n    with sync_playwright() as p:\n        browser = p.chromium.launch(headless=True)\n        page = browser.new_page()\n\n        page.goto(f\"https:\/\/www.google.com\/maps\/search\/{query.replace(' ', '+')}\")\n        page.wait_for_load_state('networkidle')\n\n        # Scroll the results panel to trigger lazy loading\n        feed = page.locator('div[role=\"feed\"]')\n        for _ in range(max_scroll):\n            feed.evaluate('el => el.scrollTop = el.scrollHeight')\n            page.wait_for_timeout(2000)\n\n        # Same caveat as Selenium: these class names can break without warning\n        results = []\n        for card in page.locator('div.Nv2PK').all():\n            name = card.locator('.qBF1Pd').text_content()\n            if name:\n                results.append({'name': name.strip()})\n\n        browser.close()\n        return results<\/code><\/pre>\n<h2>Option 3: Google Places API (Official, Most Reliable)<\/h2>\n<p>The most reliable route is Google's official API. The data comes back structured and does not break when the Maps UI changes, but each request costs money.<\/p>\n<pre><code>pip install googlemaps<\/code><\/pre>\n<pre><code>import googlemaps, time\n\ngmaps = googlemaps.Client(key='YOUR_API_KEY')\n\ndef search_places(kw, loc, max_results=60):\n    geo = gmaps.geocode(loc)\n    if not geo:\n        return []\n\n    lat = geo[0]['geometry']['location']['lat']\n    lng = geo[0]['geometry']['location']['lng']\n\n    resp = gmaps.places_nearby(\n        location=(lat, lng),\n        keyword=kw,\n        radius=10000\n    )\n    results = list(resp.get('results', []))\n\n    while 'next_page_token' in resp and len(results) < max_results:\n        time.sleep(2)  # The page token only becomes valid after a short delay\n        resp = gmaps.places_nearby(page_token=resp['next_page_token'])\n        results.extend(resp.get('results', []))\n\n    return results\n\ndata = search_places(\"dokter gigi\", \"Bandung\")\nprint(f\"Total: {len(data)} results\")<\/code><\/pre>\n<p><strong>Cost:<\/strong> Nearby Search is $0.032\/request, with a maximum of 20 results per request. 
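<\/p>\n<p>As a quick sanity check, the arithmetic behind that estimate can be scripted. This is only a sketch: the unit price and page size are the figures quoted above, so verify them against current Google Maps Platform pricing before budgeting.<\/p>\n<pre><code>import math\n\nPRICE_PER_REQUEST = 0.032  # USD per Nearby Search request (quoted above)\nRESULTS_PER_PAGE = 20      # maximum results returned per request\n\ndef estimate_cost(n_results):\n    # Lower bound: assumes every request returns a full page of results\n    requests_needed = math.ceil(n_results \/ RESULTS_PER_PAGE)\n    return round(requests_needed * PRICE_PER_REQUEST, 2)\n\nprint(estimate_cost(1000))  # 50 requests -> 1.6<\/code><\/pre>\n<p>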
For 1,000 records, estimate $1.6\u20135 USD.<\/p>\n<h2>Option 4: SerpAPI (Easiest, Paid)<\/h2>\n<p>SerpAPI handles all the complexity on their servers and returns structured data.<\/p>\n<pre><code>pip install google-search-results<\/code><\/pre>\n<pre><code>from serpapi import GoogleSearch\n\nparams = {\n    \"engine\": \"google_maps\",\n    \"q\": \"klinik gigi\",\n    \"location\": \"Surabaya, Indonesia\",\n    \"api_key\": \"YOUR_KEY\"\n}\n\nsearch = GoogleSearch(params)\nresults = search.get_dict().get('local_results', [])\nprint(f\"Results: {len(results)}\")<\/code><\/pre>\n<p><strong>Pricing:<\/strong> from $50\/month for 5,000 searches.<\/p>\n<h2>Comparing All the Methods<\/h2>\n<table>\n<thead>\n<tr>\n<th>Method<\/th>\n<th>Cost<\/th>\n<th>Reliability<\/th>\n<th>Maintenance<\/th>\n<th>Setup Time<\/th>\n<\/tr>\n<\/thead>\n<tbody>\n<tr>\n<td>Selenium \/ Playwright<\/td>\n<td>Free<\/td>\n<td>Low (breaks often)<\/td>\n<td>High<\/td>\n<td>1\u20132 days<\/td>\n<\/tr>\n<tr>\n<td>Google Places API<\/td>\n<td>Per request<\/td>\n<td>Very high<\/td>\n<td>Near zero<\/td>\n<td>A few hours<\/td>\n<\/tr>\n<tr>\n<td>SerpAPI<\/td>\n<td>$50+\/month<\/td>\n<td>High<\/td>\n<td>None<\/td>\n<td>1 hour<\/td>\n<\/tr>\n<tr>\n<td>Alanova (no-code)<\/td>\n<td>Free plan available<\/td>\n<td>High<\/td>\n<td>None<\/td>\n<td>15 minutes<\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h2>Python vs No-Code: Which Should You Pick?<\/h2>\n<p><strong>Pick Python if:<\/strong><\/p>\n<ul>\n<li>You need direct integration with a database or internal pipeline<\/li>\n<li>You have highly specific custom data transformations<\/li>\n<li>You work at very large volume with real-time requirements<\/li>\n<li>Your team has a developer who can handle the maintenance<\/li>\n<\/ul>\n<p><strong>Pick <a href=\"https:\/\/alanova.id\/\">Alanova<\/a> (no-code) if:<\/strong><\/p>\n<ul>\n<li>You have no technical background or do not want to deal with code<\/li>\n<li>You need results fast: a 15-minute setup and you are up and running<\/li>\n<li>Your main goal is leads for sales outreach<\/li>\n<li>You need email extraction from business websites (Alanova does this automatically)<\/li>\n<li>You do not want maintenance work whenever Google updates the Maps interface<\/li>\n<\/ul>\n<blockquote>\n<p>Pro tip: before investing time in a Python script, calculate the total cost of ownership: initial setup time, maintenance time every time the script breaks, and debugging time. For most sales-lead needs, that figure is almost always higher than the subscription cost of a production-ready tool.<\/p>\n<\/blockquote>\n<p>Need a Google Maps data scraping service? See a comparison of all the options in our <a href=\"https:\/\/alanova.id\/blog\/jasa-scraping-data-google-maps-indonesia\/\">guide to Google Maps data scraping services in Indonesia<\/a>.<\/p>\n<h2>Further Reading<\/h2>\n<ul>\n<li><a href=\"https:\/\/alanova.id\/blog\/cara-scraping-data-bisnis-google-maps-tanpa-coding-2025\/\">How to Scrape Google Maps Business Data Without Coding: A Complete Guide<\/a><\/li>\n<li><a href=\"https:\/\/alanova.id\/blog\/tools-scraping-google-maps-gratis-indonesia\/\">Free Google Maps Scraping Tools<\/a><\/li>\n<li><a href=\"https:\/\/alanova.id\/blog\/cara-scraping-data-google-maps-massal-strategi-volume-besar\/\">How to Scrape Google Maps at High Volume<\/a><\/li>\n<li><a href=\"https:\/\/alanova.id\/blog\/scraping-google-maps-legal-indonesia\/\">Is Scraping Google Maps Legal?<\/a><\/li>\n<\/ul>\n<p><script type=\"application\/ld+json\">\n{\n  \"@context\": \"https:\/\/schema.org\",\n  \"@type\": \"FAQPage\",\n  \"mainEntity\": [\n    {\n      \"@type\": \"Question\",\n      \"name\": \"Can you scrape Google Maps with Python?\",\n      \"acceptedAnswer\": {\n        \"@type\": \"Answer\",\n        \"text\": \"Yes, using Selenium, Playwright, the Google Places API, or SerpAPI. The Google Places API is the official, most reliable route, though it charges per request. Selenium and Playwright are free but need maintenance because Google frequently changes the structure of the Maps pages.\"\n      }\n    },\n    {\n      \"@type\": \"Question\",\n      \"name\": \"How much does scraping Google Maps via the official API cost?\",\n      \"acceptedAnswer\": {\n        \"@type\": \"Answer\",\n        \"text\": \"The Google Places API charges $0.032 per Nearby Search request, with a maximum of 20 results per request. For 1,000 records, estimate $1.6-5 USD depending on how many pages you paginate through.\"\n      }\n    },\n    {\n      \"@type\": \"Question\",\n      \"name\": \"Which Python library is best for scraping Google Maps?\",\n      \"acceptedAnswer\": {\n        \"@type\": \"Answer\",\n        \"text\": \"For the best reliability: the Google Places API via the official googlemaps library (structured data that does not break). For a free option: Selenium or Playwright, which need routine maintenance. For maximum convenience with no coding at all: Alanova, which offers a free plan.\"\n      }\n    }\n  ]\n}\n<\/script><\/p>\n","protected":false},"excerpt":{"rendered":"<p>A technical guide to scraping Google Maps with Python: Selenium, Playwright, the Google Places API, and SerpAPI compared, with code examples and advice on when it is smarter to go 
no-code.<\/p>\n","protected":false},"author":1,"featured_media":1781,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1],"tags":[],"class_list":["post-1783","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-scraping"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/alanova.id\/blog\/wp-json\/wp\/v2\/posts\/1783","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/alanova.id\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/alanova.id\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/alanova.id\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/alanova.id\/blog\/wp-json\/wp\/v2\/comments?post=1783"}],"version-history":[{"count":1,"href":"https:\/\/alanova.id\/blog\/wp-json\/wp\/v2\/posts\/1783\/revisions"}],"predecessor-version":[{"id":2013,"href":"https:\/\/alanova.id\/blog\/wp-json\/wp\/v2\/posts\/1783\/revisions\/2013"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/alanova.id\/blog\/wp-json\/wp\/v2\/media\/1781"}],"wp:attachment":[{"href":"https:\/\/alanova.id\/blog\/wp-json\/wp\/v2\/media?parent=1783"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/alanova.id\/blog\/wp-json\/wp\/v2\/categories?post=1783"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/alanova.id\/blog\/wp-json\/wp\/v2\/tags?post=1783"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}