

[Python][TensorFlow] 03. Implementing Linear Regression with TensorFlow

homebody 2019. 6. 18. 22:26

TensorFlow (Implementing Simple Linear Regression with TensorFlow)

import tensorflow as tf

# Data: y = x, so the true parameters are W = 1, b = 0
x_data = [1, 2, 3, 4, 5]
y_data = [1, 2, 3, 4, 5]

# Initialize the parameters with arbitrary starting values
W = tf.Variable(2.9)
b = tf.Variable(0.5)

# Hypothesis: a linear model, hypothesis = W * x + b
hypothesis = W * x_data + b

# Cost: mean squared error between predictions and labels
cost = tf.reduce_mean(tf.square(hypothesis - y_data))
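
This cost is just the mean squared error written with TensorFlow ops. As a quick sanity check (a minimal sketch, assuming the x_data, y_data and the initial W = 2.9, b = 0.5 defined above), the same number can be reproduced in plain Python:

# Hand-computed MSE for the initial parameters
preds = [2.9 * x + 0.5 for x in x_data]                # W * x + b per sample
squared_errors = [(p - y) ** 2 for p, y in zip(preds, y_data)]
manual_cost = sum(squared_errors) / len(squared_errors)
print(manual_cost)  # should match cost.numpy()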
  • An algorithm for finding the W and b that minimize the cost:

    • Gradient descent (one update step)

      # Learning rate: step size for each parameter update
      learning_rate = 0.01
      
      # Record the forward computation on a tape so gradients can be taken
      with tf.GradientTape() as tape:
          hypothesis = W * x_data + b
          cost = tf.reduce_mean(tf.square(hypothesis - y_data))
      
      # Differentiate the cost with respect to W and b
      W_grad, b_grad = tape.gradient(cost, [W, b])
      
      # Descend: assign_sub updates in place, W <- W - learning_rate * dcost/dW
      W.assign_sub(learning_rate * W_grad)
      b.assign_sub(learning_rate * b_grad)
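
      tape.gradient returns the partial derivatives of cost with respect to each variable, and assign_sub applies the update in place. Because this cost is a plain quadratic, autodiff can be cross-checked against the analytic gradients (a minimal sketch, assuming the same x_data, y_data, W and b as above):

      # Analytic MSE gradients: dcost/dW = mean(2*e*x), dcost/db = mean(2*e), where e = W*x + b - y
      with tf.GradientTape() as tape:
          cost = tf.reduce_mean(tf.square(W * x_data + b - y_data))
      auto_W_grad, auto_b_grad = tape.gradient(cost, [W, b])
      
      errors = [W.numpy() * x + b.numpy() - y for x, y in zip(x_data, y_data)]
      manual_W_grad = sum(2 * e * x for e, x in zip(errors, x_data)) / len(x_data)
      manual_b_grad = sum(2 * e for e in errors) / len(x_data)
      print(auto_W_grad.numpy(), manual_W_grad)  # the two W gradients should agree
      print(auto_b_grad.numpy(), manual_b_grad)  # the two b gradients should agree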
      
    • Gradient descent: the complete code

      import tensorflow as tf
      tf.enable_eager_execution()  # needed on TF 1.x; eager execution is the default in TF 2.x
      
      # Data: y = x, so the true parameters are W = 1, b = 0
      x_data = [1, 2, 3, 4, 5]
      y_data = [1, 2, 3, 4, 5]
      
      # W, b initialize
      W = tf.Variable(2.9)
      b = tf.Variable(0.5)
      
      learning_rate = 0.01
      
      for i in range(100+1):  # update W and b for 101 steps
          # Gradient descent
          with tf.GradientTape() as tape:
              hypothesis = W * x_data + b
              cost = tf.reduce_mean(tf.square(hypothesis - y_data))
      
          W_grad, b_grad = tape.gradient(cost, [W, b])
          W.assign_sub(learning_rate * W_grad)
          b.assign_sub(learning_rate * b_grad)
      
          # Log the parameters and cost every 10 steps
          if i % 10 == 0:
              print("{:5}|{:10.4f}|{:10.4f}|{:10.6f}".format(i, W.numpy(), b.numpy(), cost.numpy()))
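
      After the loop, W and b should end up close to 1 and 0 respectively, since the data satisfy y = x. The trained variables can then be used directly for prediction (a minimal sketch; the input value 6 is a hypothetical example, not from the original post):

      # Predict a new input with the learned parameters
      print((W * 6 + b).numpy())  # expected to be close to 6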
      