(resolved) AWS and vMX100: one-way connectivity from office autoVPN (IN success, OUT fail)
edit: RESOLVED: The issue was AWS Security Groups that someone had applied to the servers.
I think I have an AWS routing issue with the vMX100, but I'm not sure. Everything worked great with the AWS VPN up, but after the vMX100 was installed (and the AWS VPN removed), we see the following results:
Traffic from our office to AWS (over AutoVPN):
Ping is successful (replies from the AWS servers).
Traffic from AWS to our office (over AutoVPN):
Ping fails. We cannot reach devices in the office from AWS.
Instances can only ping the Internet and local VPC instances.
There is no problem connecting to the AWS devices over the AutoVPN, but no traffic can be initiated from the AWS instances. We have devices trying to sync with servers in the office, and they are failing.
What we've checked:
- Routing table: all routes point to the vMX instance (except 0.0.0.0/0 and the local subnet).
- NACL: wide open in both directions.
- AWS Security Group: default (wide open).
- All AutoVPN tunnels came up without issue.
- Firewall rules are wide open to/from the Meraki.
Thank you for any help getting this vMX to work in AWS.
edit: This is the document we used for the vMX100 install in AWS.
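Since the fix turned out to be Security Groups on the servers themselves, it can help to confirm what SGs are actually attached to each instance and what inbound rules they carry. A minimal AWS CLI sketch (the instance ID, region, and group ID below are placeholders, not values from this setup):

```shell
# List the security groups attached to a given instance
aws ec2 describe-instances \
  --region us-east-1 \
  --instance-ids i-0123456789abcdef0 \
  --query 'Reservations[].Instances[].SecurityGroups'

# Show the inbound rules of a specific security group
aws ec2 describe-security-groups \
  --region us-east-1 \
  --group-ids sg-0123456789abcdef0 \
  --query 'SecurityGroups[].IpPermissions'
```

If the inbound rules only allow traffic from the VPC CIDR, return traffic from the office subnets arriving via the vMX will be dropped at the instance even though the routing table, NACLs, and AutoVPN are all correct.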
This same issue plagued me for a few weeks. I had multiple Cisco and AWS support engineers troubleshooting with me, and we could not figure it out until this morning. I wanted to post a more specific solution in case others like me find this and start tearing their hair out:
If you want your EC2 instance to be able to SSH to an on-prem device through the Meraki, you need to add an inbound rule to your vMX's security group that allows SSH traffic from the VPC CIDR (or from a narrower subnet or individual IP within that block).
Creating a vMX in AWS autogenerates a set of security group rules, and SSH is not one of the protocols included, because the Meraki does not accept SSH to itself (so there would normally be no need for it).
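The missing rule described above can be added with one AWS CLI call. A sketch, assuming the vMX's security group ID and a VPC CIDR of 10.0.0.0/16 (both placeholders for your own values):

```shell
# Allow SSH (TCP 22) into the vMX's security group from the VPC CIDR,
# so EC2 instances can initiate SSH through the vMX to on-prem devices
aws ec2 authorize-security-group-ingress \
  --region us-east-1 \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 22 \
  --cidr 10.0.0.0/16
```

The same pattern applies to any other protocol your instances need to initiate toward the office: the vMX's security group must permit it inbound from the VPC side, since the autogenerated rules only cover the protocols Meraki expects.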