Technology Apr 18, 2026 · 5 min read

Korea's #1 Real Estate Platform Has No Official API — So I Built a Scraper. Then Got Blocked.


DEV Community · by Session zero

Korea has a real estate problem. Not in the market — in the data.

Naver Real Estate (land.naver.com) is South Korea's dominant property platform. Millions of Koreans check it before every apartment decision: buying, renting, investing. It's where prices are listed, where transactions happen, where the market shows its face.

But there's no official API.

Not restricted. Not paid. Not deprecated. Non-existent.

The Gap I Found This Week

While mapping the competitive landscape for Korean data scrapers on Apify, I found exactly one actor for Naver Real Estate. One developer had built it, priced it at $3/1,000 results, and made it available.

It was marked deprecated. Last modified about a month ago.

Here's the part that stopped me: 3 users were still running it monthly.

That's not a failed product. That's demand outliving supply. Three people needed Korea's real estate data badly enough to keep trying a broken tool rather than give up.

Why Naver Real Estate Data Matters

The use cases are real and high-value:

For investors: Korean apartment prices move fast. The Gangnam dip, the Mapo surge — if you're tracking price trends across districts, you need data at scale, not manual lookups.

For researchers and journalists: Korea's housing market is a major economic indicator. Supply/demand ratios, transaction velocity, price-per-square-meter by neighborhood — this is the kind of data economists need.

For real estate agents and PropTech: Automated market reports, pricing alerts, comparables. The data exists on Naver, but there's no programmatic way to get it.

The demand is there. The supply just disappeared.

What the Unofficial API Looks Like

Naver Real Estate doesn't offer an API. But like many Korean platforms, it exposes structured JSON endpoints behind its frontend — just not officially documented.

The pattern looks something like this:

# Search complexes by region
GET https://new.land.naver.com/api/complexes/single-markers/2.0
  ?cortarNo={district_code}&realEstateType=APT&tradeType=A1

# Get complex details
GET https://new.land.naver.com/api/complexes/{complexNo}

# Get listings for a complex
GET https://new.land.naver.com/api/complexes/{complexNo}/articles

The returned data is rich. Each complex record carries the complex name, total households, and latitude/longitude; each listing carries property type, trade type (sale/jeonse/monthly rent), price, exclusive area, floor, and direction.
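As a sketch of how a client might target the search endpoint, the following builds the query URL with the parameters shown above. The endpoint path and the three parameter names come from the observed traffic; the helper name and the example district code (1144000000, which I believe maps to Mapo-gu) are my own assumptions.

```javascript
// Sketch: building the undocumented complex-search URL.
// Only cortarNo/realEstateType/tradeType are taken from observed
// traffic; nothing here is an official, guaranteed interface.
const NAVER_LAND_API = 'https://new.land.naver.com/api';

function buildComplexSearchUrl(cortarNo) {
  const params = new URLSearchParams({
    cortarNo,                // hierarchical district code
    realEstateType: 'APT',   // apartments
    tradeType: 'A1',         // sale listings
  });
  return `${NAVER_LAND_API}/complexes/single-markers/2.0?${params}`;
}

// The real request additionally needs a Korean IP and browser-like
// headers, or Naver closes the connection.
console.log(buildComplexSearchUrl('1144000000'));
```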

The catch: a Korean proxy is required, because Naver aggressively blocks non-Korean IPs. And the district codes follow a specific hierarchical system (법정동코드, the government's legal district code scheme) that requires its own mapping layer.

The Two-Step Architecture

The interesting design challenge here isn't the API calls — it's the search model.

Most scrapers work in a flat search: query → results. Naver Real Estate is hierarchical:

  1. Region → Complex list: Given a district (e.g., Mapo-gu), find all apartment complexes
  2. Complex → Listing list: For each complex, fetch current listings

This two-step architecture means the actor needs to handle:

  • District code input (user-friendly) → internal Naver code mapping
  • Pagination at both levels (many complexes per district, many listings per complex)
  • Throttling to avoid rate limits at scale

It's more complex than most of my existing actors. But the infrastructure is already there — I've been building Korean scrapers for months.

What I Built

I went from endpoint mapping to deployed actor in 48 hours.

The MVP takes GPS coordinates as input:

{
  "lat": 37.3595704,
  "lon": 127.105399,
  "zoom": 16
}

Internally, it:

  1. Navigates to Naver Land to establish a session cookie (Playwright)
  2. Queries /api/cortars to get the administrative district code (cortarNo) and boundary polygon for the given location
  3. Extracts the bounding box from the polygon vertices
  4. Calls /api/complexes/single-markers/2.0 with all the right parameters
  5. Formats prices (e.g. 28000만원 → "2.8억", i.e. 280 million KRW) and outputs to an Apify Dataset
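Step 5 hinges on Korean price units: Naver returns prices in 만원 (10,000 KRW), while listings are conventionally read in 억 (100,000,000 KRW). A minimal version of that conversion might look like this (helper name mine, edge cases simplified):

```javascript
// Convert a price in 만원 (10,000 KRW units) to the conventional
// 억-based display string, e.g. 28000만원 → "2.8억".
function formatPrice(manwon) {
  if (manwon < 10000) return `${manwon}만원`; // below 1억, keep as-is
  const eok = manwon / 10000;
  // toFixed(2) then parseFloat drops trailing zeros: 2.80 → 2.8, 13.00 → 13
  return `${parseFloat(eok.toFixed(2))}억`;
}

console.log(formatPrice(28000)); // "2.8억"
```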

The cortarNo → bbox relationship is the critical piece. Naver's API requires both to match — you can't use a generic bounding box, you need the exact polygon for the specific district the coordinates fall in.
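Extracting the bounding box from the polygon (step 3) is mechanical once the vertex format is known. Here I assume the /api/cortars polygon arrives as [lat, lon] pairs and that the markers endpoint takes leftLon/rightLon/topLat/bottomLat query parameters; both are assumptions based on observed responses, not documented behavior.

```javascript
// Derive the bounding box for the markers endpoint from the district
// boundary polygon returned by /api/cortars (vertex format assumed).
function boundingBox(vertices) {
  let minLat = Infinity, maxLat = -Infinity;
  let minLon = Infinity, maxLon = -Infinity;
  for (const [lat, lon] of vertices) {
    if (lat < minLat) minLat = lat;
    if (lat > maxLat) maxLat = lat;
    if (lon < minLon) minLon = lon;
    if (lon > maxLon) maxLon = lon;
  }
  return { bottomLat: minLat, topLat: maxLat, leftLon: minLon, rightLon: maxLon };
}
```

Because Naver validates that the box matches the district's own boundary, it has to be computed from that district's vertices rather than any generic viewport.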

Build succeeded. Docker image pushed. Actor live on Apify.

Then I ran it.

The Wall

Navigation timed out after 60 seconds
net::ERR_CONNECTION_CLOSED

Naver Real Estate blocked the request immediately. Not a rate limit — a hard block on the first connection.

The reason: Apify runs its infrastructure in US data centers. Naver aggressively geo-blocks non-Korean IPs. No session, no cookie, no data.

I knew this going in — I'd documented it during the feasibility analysis. But knowing it and hitting the wall are different things. The actor built cleanly, the code compiled, the Docker image was ready. Then three lines of proxy configuration stood between a working scraper and a blocked connection.

The fix is straightforward: add Apify's Korean Residential Proxy to the crawler configuration. Three lines:

const proxyConfiguration = await Actor.createProxyConfiguration({
    groups: ['RESIDENTIAL'],
    countryCode: 'KR',
});

Residential proxy costs money per GB. Worth it for real estate data — but it's a cost decision, not a code decision.

Planned pricing once running: $5-8/1,000 results. Real estate data is worth more than news or place search.

The Broader Pattern

This isn't unique to real estate.

I've seen this pattern three times now in the Korean data space:

  1. naver-news-scraper: I built it. It now runs 10,000+ times a month. Most users are automating news monitoring — they run it constantly because Korean news data decays fast.

  2. naver-place-search: I built it. Users run it 30x per month on average. Point-in-time lookups for local business data.

  3. naver-land-scraper (the deprecated one): Someone built it. Even broken, 3 people a month needed it.

The pattern is: Korean data exists, official API doesn't, scraper fills the gap, demand follows.

What's Next

The actor is built. The build passes. The only thing standing between this and a working product is three lines of proxy configuration and a cost decision.

Once the Korean proxy is enabled, I'll run the validation test: lat=37.3595704, lon=127.105399 (Seongnam, Bundang-gu) should return 21 apartment complexes. That's my acceptance criterion.

After that: expand from coordinate input to district name input, add pagination for large districts, and iterate on pricing data coverage.

If you're tracking Korean real estate data — or know someone who is — I'd love to hear what data you actually need. Drop it in the comments.

I build Korean data APIs on Apify — news, places, real estate, and more. View my actors.

Source

This article was originally published by DEV Community and written by Session zero.
