Migrating Redis Data from IDC to AWS

mingrammer
February 28, 2019


Redis data migration write-up: https://mingrammer.com/redis-migration/


Transcript

  1. Migrating Redis Data from IDC to AWS. AWSKRUG Architecture | 2019.02.28

  2. Name: MinJae Kwon (권민재) Nickname: @mingrammer Email: mingrammer@gmail.com

    Who: Game Server Engineer @ SundayToz Blog: https://mingrammer.com Facebook: https://facebook.com/mingrammer Github: https://github.com/mingrammer Eng Blog: https://medium.com/@mingrammer
  3. (architecture diagram)

  4. (architecture diagram: Amazon RDS, Amazon ElastiCache)

  5. • Redis servers: 37 (a mix of Redis 2.6 / 2.8)

    • Total pure data size (excluding overhead): around 20GB • Total key count: around 33M (33,000,000) • All the distributed keys are to be consolidated into a single place
  6. Public Outbound: X / Public Inbound: X / Multi Master: X

    Key migration / Merge redis dbs
  7. Public Outbound: X / Public Inbound: X / Multi Master: X

    Key migration / Merge redis dbs. Proxy? Merge?
  8. Accessible via VPN / SSH. Key migration

  9. Key migration: accessible via VPN / SSH

    Use a desktop (1 iMac with 24GB RAM) as a proxy
  10. Merge redis dbs • Merge 37 rdb files into a

    single rdb file • Dump & restore for all keys
  11. Merge redis dbs • Merge 37 rdb files into a

    single rdb file • Dump & restore for all keys. There are no official / 3rd-party rdb merge tools
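Since no rdb merge tool exists, merging comes down to per-key DUMP & RESTORE. A minimal sketch of that merge loop (the function name, ports, and client setup are illustrative assumptions, not from the deck):

```python
# Merging several Redis DBs into one via per-key DUMP/RESTORE (a sketch).
# `src` and `dst` are any clients exposing scan/pttl/dump/restore,
# e.g. redis.StrictRedis instances from redis-py.

def copy_all_keys(src, dst, batch=1000):
    """Copy every key from src to dst, preserving per-key TTLs."""
    copied = 0
    cursor = 0
    while True:
        # SCAN walks the keyspace incrementally without blocking the server
        cursor, keys = src.scan(cursor, count=batch)
        for key in keys:
            ttl = src.pttl(key)   # remaining TTL in ms; negative if no expiry
            data = src.dump(key)  # opaque serialized value; None if the key vanished
            if data is not None:
                # RESTORE with ttl=0 means "no expiry"
                dst.restore(key, ttl if ttl > 0 else 0, data)
                copied += 1
        if cursor == 0:  # full SCAN iteration finished
            break
    return copied

# Illustrative usage against the 37 local replicas on :6400-:6436:
#   import redis
#   target = redis.StrictRedis(host='localhost', port=6379)
#   for port in range(6400, 6437):
#       source = redis.StrictRedis(host='localhost', port=port)
#       copy_all_keys(source, target)
```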
  13. Data Migration Architecture (diagram): 37 redis servers in IDC (192.x.x.x),

    replicated to local Redis servers on ports :6400-:6436 of 1 iMac with 24GB RAM; up to 8 parallel replicas active at a time, the rest waiting
  14. Data Migration Process

    • Bundle the replication-based data pull and the transfer to EC2 into a single task • Spawn 8 processes that run this task • Prepare 37 local ports, then per task assign a port and launch a local Redis server on it (localhost:64xx) • Up to 8 local Redis servers each establish replication with an IDC Redis server (slaveof 192.x.x.x 6379) • Once data synchronization via replication completes, break the replication and start transferring the data to EC2 • Once the transfer to EC2 completes, shut down the local Redis server, delete its dump file, and return the process to the pool
  20. Data Migration Process

    def run(port, server):
        local_server = 'localhost:{}'.format(port)
        fetch(server, local_server)
        migrate_all(local_server, 'ec2 server ip')
        shutdown(local_server)
        clear_dump(port)

    def main(start, end):
        ...
        pool = Pool(processes=8)
        pool.starmap(run, zip(local_ports, legacy_servers))
  21. Data Migration Process

    $ ls -l
    ... dump6400.rdb   # local port whose sync is complete; the rdb is deleted once the transfer to EC2 also completes
    ... dump6409.rdb
    ... dump6405.rdb
    ... dump6406.rdb
    ... temp_6401.rdb  # local port still synchronizing
    ... temp_6408.rdb
    ... temp_6403.rdb
    ... temp_6402.rdb
  22. Data Migration Process (Synchronization with Replication): start redis server after ping test

    # src: IDC server
    # dst: local server
    def fetch(src, dst):
        ...
        r = redis.StrictRedis(host=dsthost, port=dstport, charset='utf8')
        try:
            r.ping()
        except redis.exceptions.ConnectionError:
            # No local server on this port yet; start one as a daemon
            subprocess.call([
                'redis-server',
                '--port', dstport,
                '--dbfilename', 'dump{}.rdb'.format(dstport),
                '--daemonize', 'yes',
            ], stdout=subprocess.DEVNULL)
        # Begin replicating from the IDC server
        print('[{}] [{}|{}] Slave status: {}'.format(now(), src, dst, r.slaveof(srchost, srcport)))
        ...
  23. Data Migration Process (Synchronization with Replication): sync and check done

    def fetch(src, dst):
        ...
        while True:
            master_link_status = r.info('replication')['master_link_status']
            master_sync_in_progress = r.info('replication')['master_sync_in_progress']
            if master_link_status == 'up' and master_sync_in_progress == 0:
                # Sync is done: break the replication (SLAVEOF NO ONE)
                r.slaveof()
                break
        print('[{}] [{}|{}] All keys are fetched.'.format(now(), src, dst))
  24. Data Migration Process (Dump & Restore): parallelism by keyspace

    def migrate_all(src, dst):
        ...
        r = redis.StrictRedis(host=srchost, port=srcport, charset='utf8')
        keyspace = r.info('keyspace')
        print('[{}] [{}|{}] Started migrating.'.format(now(), src, dst))
        # One greenlet per logical db (keyspace keys look like 'db0', 'db1', ...)
        jobs = [gevent.spawn(migrate, src, dst, int(k[2:])) for k in keyspace.keys()]
        gevent.joinall(jobs)
        print('[{}] [{}|{}] Migration was done.'.format(now(), src, dst))
  25. Data Migration Process (Dump & Restore): migrate all keys with pipelining

    def migrate(src, dst, db):
        count = 2500
        cursor = 0
        ...
        while True:
            # Scan 2,500 keys at a time
            cursor, keys = srcr.scan(cursor, count=count)
            # dump
            pipeline = srcr.pipeline(transaction=False)
            for key in keys:
                pipeline.pttl(key)
                pipeline.dump(key)
            result = pipeline.execute()
            # restore (results come back as pttl/dump pairs per key)
            pipeline = dstr.pipeline(transaction=False)
            for key, ttl, data in zip(keys, result[::2], result[1::2]):
                if data is not None:
                    # Add 3 hours (10,800,000 ms) to positive TTLs; 0 means no expiry
                    pipeline.restore(key, ttl + 10800000 if ttl > 0 else 0, data)
            pipeline.execute(False)
            ...
            if cursor == 0:
                break
  27. Conclusion

  28. Conclusion

    ✓ Desktop → EC2: 23 min (about 24,000 keys/s)
    ✓ EC2 → ElastiCache: 14 min (about 40,000 keys/s)
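Both measured rates are consistent with the roughly 33M total keys from slide 5; a quick back-of-envelope cross-check:

```python
# Cross-check the measured throughput against the ~33M (33,000,000) key total
keys_via_ec2 = 24_000 * 23 * 60          # keys/s * minutes * 60
keys_via_elasticache = 40_000 * 14 * 60
print(keys_via_ec2)          # 33120000
print(keys_via_elasticache)  # 33600000
```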
  29. Conclusion: migration retrospective